Dec  1 03:35:57 np0005540741 kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Dec  1 03:35:57 np0005540741 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec  1 03:35:57 np0005540741 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  1 03:35:57 np0005540741 kernel: BIOS-provided physical RAM map:
Dec  1 03:35:57 np0005540741 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec  1 03:35:57 np0005540741 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec  1 03:35:57 np0005540741 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec  1 03:35:57 np0005540741 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec  1 03:35:57 np0005540741 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec  1 03:35:57 np0005540741 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec  1 03:35:57 np0005540741 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec  1 03:35:57 np0005540741 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec  1 03:35:57 np0005540741 kernel: NX (Execute Disable) protection: active
Dec  1 03:35:57 np0005540741 kernel: APIC: Static calls initialized
Dec  1 03:35:57 np0005540741 kernel: SMBIOS 2.8 present.
Dec  1 03:35:57 np0005540741 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec  1 03:35:57 np0005540741 kernel: Hypervisor detected: KVM
Dec  1 03:35:57 np0005540741 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec  1 03:35:57 np0005540741 kernel: kvm-clock: using sched offset of 3487215260 cycles
Dec  1 03:35:57 np0005540741 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec  1 03:35:57 np0005540741 kernel: tsc: Detected 2800.000 MHz processor
Dec  1 03:35:57 np0005540741 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec  1 03:35:57 np0005540741 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec  1 03:35:57 np0005540741 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec  1 03:35:57 np0005540741 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec  1 03:35:57 np0005540741 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec  1 03:35:57 np0005540741 kernel: Using GB pages for direct mapping
Dec  1 03:35:57 np0005540741 kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Dec  1 03:35:57 np0005540741 kernel: ACPI: Early table checksum verification disabled
Dec  1 03:35:57 np0005540741 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec  1 03:35:57 np0005540741 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  1 03:35:57 np0005540741 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  1 03:35:57 np0005540741 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  1 03:35:57 np0005540741 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec  1 03:35:57 np0005540741 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  1 03:35:57 np0005540741 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  1 03:35:57 np0005540741 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec  1 03:35:57 np0005540741 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec  1 03:35:57 np0005540741 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec  1 03:35:57 np0005540741 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec  1 03:35:57 np0005540741 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec  1 03:35:57 np0005540741 kernel: No NUMA configuration found
Dec  1 03:35:57 np0005540741 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec  1 03:35:57 np0005540741 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Dec  1 03:35:57 np0005540741 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec  1 03:35:57 np0005540741 kernel: Zone ranges:
Dec  1 03:35:57 np0005540741 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec  1 03:35:57 np0005540741 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec  1 03:35:57 np0005540741 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec  1 03:35:57 np0005540741 kernel:  Device   empty
Dec  1 03:35:57 np0005540741 kernel: Movable zone start for each node
Dec  1 03:35:57 np0005540741 kernel: Early memory node ranges
Dec  1 03:35:57 np0005540741 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec  1 03:35:57 np0005540741 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec  1 03:35:57 np0005540741 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec  1 03:35:57 np0005540741 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec  1 03:35:57 np0005540741 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec  1 03:35:57 np0005540741 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec  1 03:35:57 np0005540741 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec  1 03:35:57 np0005540741 kernel: ACPI: PM-Timer IO Port: 0x608
Dec  1 03:35:57 np0005540741 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec  1 03:35:57 np0005540741 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec  1 03:35:57 np0005540741 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec  1 03:35:57 np0005540741 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec  1 03:35:57 np0005540741 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec  1 03:35:57 np0005540741 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec  1 03:35:57 np0005540741 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec  1 03:35:57 np0005540741 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec  1 03:35:57 np0005540741 kernel: TSC deadline timer available
Dec  1 03:35:57 np0005540741 kernel: CPU topo: Max. logical packages:   8
Dec  1 03:35:57 np0005540741 kernel: CPU topo: Max. logical dies:       8
Dec  1 03:35:57 np0005540741 kernel: CPU topo: Max. dies per package:   1
Dec  1 03:35:57 np0005540741 kernel: CPU topo: Max. threads per core:   1
Dec  1 03:35:57 np0005540741 kernel: CPU topo: Num. cores per package:     1
Dec  1 03:35:57 np0005540741 kernel: CPU topo: Num. threads per package:   1
Dec  1 03:35:57 np0005540741 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec  1 03:35:57 np0005540741 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec  1 03:35:57 np0005540741 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec  1 03:35:57 np0005540741 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec  1 03:35:57 np0005540741 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec  1 03:35:57 np0005540741 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec  1 03:35:57 np0005540741 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec  1 03:35:57 np0005540741 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec  1 03:35:57 np0005540741 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec  1 03:35:57 np0005540741 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec  1 03:35:57 np0005540741 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec  1 03:35:57 np0005540741 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec  1 03:35:57 np0005540741 kernel: Booting paravirtualized kernel on KVM
Dec  1 03:35:57 np0005540741 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec  1 03:35:57 np0005540741 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec  1 03:35:57 np0005540741 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec  1 03:35:57 np0005540741 kernel: kvm-guest: PV spinlocks disabled, no host support
Dec  1 03:35:57 np0005540741 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  1 03:35:57 np0005540741 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Dec  1 03:35:57 np0005540741 kernel: random: crng init done
Dec  1 03:35:57 np0005540741 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec  1 03:35:57 np0005540741 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec  1 03:35:57 np0005540741 kernel: Fallback order for Node 0: 0 
Dec  1 03:35:57 np0005540741 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec  1 03:35:57 np0005540741 kernel: Policy zone: Normal
Dec  1 03:35:57 np0005540741 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec  1 03:35:57 np0005540741 kernel: software IO TLB: area num 8.
Dec  1 03:35:57 np0005540741 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec  1 03:35:57 np0005540741 kernel: ftrace: allocating 49313 entries in 193 pages
Dec  1 03:35:57 np0005540741 kernel: ftrace: allocated 193 pages with 3 groups
Dec  1 03:35:57 np0005540741 kernel: Dynamic Preempt: voluntary
Dec  1 03:35:57 np0005540741 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec  1 03:35:57 np0005540741 kernel: rcu: 	RCU event tracing is enabled.
Dec  1 03:35:57 np0005540741 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec  1 03:35:57 np0005540741 kernel: 	Trampoline variant of Tasks RCU enabled.
Dec  1 03:35:57 np0005540741 kernel: 	Rude variant of Tasks RCU enabled.
Dec  1 03:35:57 np0005540741 kernel: 	Tracing variant of Tasks RCU enabled.
Dec  1 03:35:57 np0005540741 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec  1 03:35:57 np0005540741 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec  1 03:35:57 np0005540741 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  1 03:35:57 np0005540741 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  1 03:35:57 np0005540741 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  1 03:35:57 np0005540741 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec  1 03:35:57 np0005540741 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec  1 03:35:57 np0005540741 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec  1 03:35:57 np0005540741 kernel: Console: colour VGA+ 80x25
Dec  1 03:35:57 np0005540741 kernel: printk: console [ttyS0] enabled
Dec  1 03:35:57 np0005540741 kernel: ACPI: Core revision 20230331
Dec  1 03:35:57 np0005540741 kernel: APIC: Switch to symmetric I/O mode setup
Dec  1 03:35:57 np0005540741 kernel: x2apic enabled
Dec  1 03:35:57 np0005540741 kernel: APIC: Switched APIC routing to: physical x2apic
Dec  1 03:35:57 np0005540741 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec  1 03:35:57 np0005540741 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Dec  1 03:35:57 np0005540741 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec  1 03:35:57 np0005540741 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec  1 03:35:57 np0005540741 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec  1 03:35:57 np0005540741 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec  1 03:35:57 np0005540741 kernel: Spectre V2 : Mitigation: Retpolines
Dec  1 03:35:57 np0005540741 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec  1 03:35:57 np0005540741 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec  1 03:35:57 np0005540741 kernel: RETBleed: Mitigation: untrained return thunk
Dec  1 03:35:57 np0005540741 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec  1 03:35:57 np0005540741 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec  1 03:35:57 np0005540741 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec  1 03:35:57 np0005540741 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec  1 03:35:57 np0005540741 kernel: x86/bugs: return thunk changed
Dec  1 03:35:57 np0005540741 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec  1 03:35:57 np0005540741 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec  1 03:35:57 np0005540741 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec  1 03:35:57 np0005540741 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec  1 03:35:57 np0005540741 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec  1 03:35:57 np0005540741 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec  1 03:35:57 np0005540741 kernel: Freeing SMP alternatives memory: 40K
Dec  1 03:35:57 np0005540741 kernel: pid_max: default: 32768 minimum: 301
Dec  1 03:35:57 np0005540741 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec  1 03:35:57 np0005540741 kernel: landlock: Up and running.
Dec  1 03:35:57 np0005540741 kernel: Yama: becoming mindful.
Dec  1 03:35:57 np0005540741 kernel: SELinux:  Initializing.
Dec  1 03:35:57 np0005540741 kernel: LSM support for eBPF active
Dec  1 03:35:57 np0005540741 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec  1 03:35:57 np0005540741 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec  1 03:35:57 np0005540741 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec  1 03:35:57 np0005540741 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec  1 03:35:57 np0005540741 kernel: ... version:                0
Dec  1 03:35:57 np0005540741 kernel: ... bit width:              48
Dec  1 03:35:57 np0005540741 kernel: ... generic registers:      6
Dec  1 03:35:57 np0005540741 kernel: ... value mask:             0000ffffffffffff
Dec  1 03:35:57 np0005540741 kernel: ... max period:             00007fffffffffff
Dec  1 03:35:57 np0005540741 kernel: ... fixed-purpose events:   0
Dec  1 03:35:57 np0005540741 kernel: ... event mask:             000000000000003f
Dec  1 03:35:57 np0005540741 kernel: signal: max sigframe size: 1776
Dec  1 03:35:57 np0005540741 kernel: rcu: Hierarchical SRCU implementation.
Dec  1 03:35:57 np0005540741 kernel: rcu: 	Max phase no-delay instances is 400.
Dec  1 03:35:57 np0005540741 kernel: smp: Bringing up secondary CPUs ...
Dec  1 03:35:57 np0005540741 kernel: smpboot: x86: Booting SMP configuration:
Dec  1 03:35:57 np0005540741 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec  1 03:35:57 np0005540741 kernel: smp: Brought up 1 node, 8 CPUs
Dec  1 03:35:57 np0005540741 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Dec  1 03:35:57 np0005540741 kernel: node 0 deferred pages initialised in 16ms
Dec  1 03:35:57 np0005540741 kernel: Memory: 7765960K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 616272K reserved, 0K cma-reserved)
Dec  1 03:35:57 np0005540741 kernel: devtmpfs: initialized
Dec  1 03:35:57 np0005540741 kernel: x86/mm: Memory block size: 128MB
Dec  1 03:35:57 np0005540741 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec  1 03:35:57 np0005540741 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Dec  1 03:35:57 np0005540741 kernel: pinctrl core: initialized pinctrl subsystem
Dec  1 03:35:57 np0005540741 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec  1 03:35:57 np0005540741 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec  1 03:35:57 np0005540741 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec  1 03:35:57 np0005540741 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec  1 03:35:57 np0005540741 kernel: audit: initializing netlink subsys (disabled)
Dec  1 03:35:57 np0005540741 kernel: audit: type=2000 audit(1764578155.296:1): state=initialized audit_enabled=0 res=1
Dec  1 03:35:57 np0005540741 kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec  1 03:35:57 np0005540741 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec  1 03:35:57 np0005540741 kernel: thermal_sys: Registered thermal governor 'user_space'
Dec  1 03:35:57 np0005540741 kernel: cpuidle: using governor menu
Dec  1 03:35:57 np0005540741 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec  1 03:35:57 np0005540741 kernel: PCI: Using configuration type 1 for base access
Dec  1 03:35:57 np0005540741 kernel: PCI: Using configuration type 1 for extended access
Dec  1 03:35:57 np0005540741 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec  1 03:35:57 np0005540741 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec  1 03:35:57 np0005540741 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec  1 03:35:57 np0005540741 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec  1 03:35:57 np0005540741 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec  1 03:35:57 np0005540741 kernel: Demotion targets for Node 0: null
Dec  1 03:35:57 np0005540741 kernel: cryptd: max_cpu_qlen set to 1000
Dec  1 03:35:57 np0005540741 kernel: ACPI: Added _OSI(Module Device)
Dec  1 03:35:57 np0005540741 kernel: ACPI: Added _OSI(Processor Device)
Dec  1 03:35:57 np0005540741 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec  1 03:35:57 np0005540741 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec  1 03:35:57 np0005540741 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec  1 03:35:57 np0005540741 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec  1 03:35:57 np0005540741 kernel: ACPI: Interpreter enabled
Dec  1 03:35:57 np0005540741 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec  1 03:35:57 np0005540741 kernel: ACPI: Using IOAPIC for interrupt routing
Dec  1 03:35:57 np0005540741 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec  1 03:35:57 np0005540741 kernel: PCI: Using E820 reservations for host bridge windows
Dec  1 03:35:57 np0005540741 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec  1 03:35:57 np0005540741 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec  1 03:35:57 np0005540741 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [3] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [4] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [5] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [6] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [7] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [8] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [9] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [10] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [11] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [12] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [13] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [14] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [15] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [16] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [17] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [18] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [19] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [20] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [21] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [22] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [23] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [24] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [25] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [26] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [27] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [28] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [29] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [30] registered
Dec  1 03:35:57 np0005540741 kernel: acpiphp: Slot [31] registered
Dec  1 03:35:57 np0005540741 kernel: PCI host bridge to bus 0000:00
Dec  1 03:35:57 np0005540741 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec  1 03:35:57 np0005540741 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec  1 03:35:57 np0005540741 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec  1 03:35:57 np0005540741 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec  1 03:35:57 np0005540741 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec  1 03:35:57 np0005540741 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec  1 03:35:57 np0005540741 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec  1 03:35:57 np0005540741 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec  1 03:35:57 np0005540741 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec  1 03:35:57 np0005540741 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec  1 03:35:57 np0005540741 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec  1 03:35:57 np0005540741 kernel: iommu: Default domain type: Translated
Dec  1 03:35:57 np0005540741 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec  1 03:35:57 np0005540741 kernel: SCSI subsystem initialized
Dec  1 03:35:57 np0005540741 kernel: ACPI: bus type USB registered
Dec  1 03:35:57 np0005540741 kernel: usbcore: registered new interface driver usbfs
Dec  1 03:35:57 np0005540741 kernel: usbcore: registered new interface driver hub
Dec  1 03:35:57 np0005540741 kernel: usbcore: registered new device driver usb
Dec  1 03:35:57 np0005540741 kernel: pps_core: LinuxPPS API ver. 1 registered
Dec  1 03:35:57 np0005540741 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec  1 03:35:57 np0005540741 kernel: PTP clock support registered
Dec  1 03:35:57 np0005540741 kernel: EDAC MC: Ver: 3.0.0
Dec  1 03:35:57 np0005540741 kernel: NetLabel: Initializing
Dec  1 03:35:57 np0005540741 kernel: NetLabel:  domain hash size = 128
Dec  1 03:35:57 np0005540741 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec  1 03:35:57 np0005540741 kernel: NetLabel:  unlabeled traffic allowed by default
Dec  1 03:35:57 np0005540741 kernel: PCI: Using ACPI for IRQ routing
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec  1 03:35:57 np0005540741 kernel: vgaarb: loaded
Dec  1 03:35:57 np0005540741 kernel: clocksource: Switched to clocksource kvm-clock
Dec  1 03:35:57 np0005540741 kernel: VFS: Disk quotas dquot_6.6.0
Dec  1 03:35:57 np0005540741 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec  1 03:35:57 np0005540741 kernel: pnp: PnP ACPI init
Dec  1 03:35:57 np0005540741 kernel: pnp: PnP ACPI: found 5 devices
Dec  1 03:35:57 np0005540741 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec  1 03:35:57 np0005540741 kernel: NET: Registered PF_INET protocol family
Dec  1 03:35:57 np0005540741 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec  1 03:35:57 np0005540741 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec  1 03:35:57 np0005540741 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec  1 03:35:57 np0005540741 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec  1 03:35:57 np0005540741 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec  1 03:35:57 np0005540741 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec  1 03:35:57 np0005540741 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec  1 03:35:57 np0005540741 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec  1 03:35:57 np0005540741 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec  1 03:35:57 np0005540741 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec  1 03:35:57 np0005540741 kernel: NET: Registered PF_XDP protocol family
Dec  1 03:35:57 np0005540741 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec  1 03:35:57 np0005540741 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec  1 03:35:57 np0005540741 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec  1 03:35:57 np0005540741 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec  1 03:35:57 np0005540741 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec  1 03:35:57 np0005540741 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec  1 03:35:57 np0005540741 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 84515 usecs
Dec  1 03:35:57 np0005540741 kernel: PCI: CLS 0 bytes, default 64
Dec  1 03:35:57 np0005540741 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec  1 03:35:57 np0005540741 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec  1 03:35:57 np0005540741 kernel: Trying to unpack rootfs image as initramfs...
Dec  1 03:35:57 np0005540741 kernel: ACPI: bus type thunderbolt registered
Dec  1 03:35:57 np0005540741 kernel: Initialise system trusted keyrings
Dec  1 03:35:57 np0005540741 kernel: Key type blacklist registered
Dec  1 03:35:57 np0005540741 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec  1 03:35:57 np0005540741 kernel: zbud: loaded
Dec  1 03:35:57 np0005540741 kernel: integrity: Platform Keyring initialized
Dec  1 03:35:57 np0005540741 kernel: integrity: Machine keyring initialized
Dec  1 03:35:57 np0005540741 kernel: Freeing initrd memory: 85868K
Dec  1 03:35:57 np0005540741 kernel: NET: Registered PF_ALG protocol family
Dec  1 03:35:57 np0005540741 kernel: xor: automatically using best checksumming function   avx       
Dec  1 03:35:57 np0005540741 kernel: Key type asymmetric registered
Dec  1 03:35:57 np0005540741 kernel: Asymmetric key parser 'x509' registered
Dec  1 03:35:57 np0005540741 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec  1 03:35:57 np0005540741 kernel: io scheduler mq-deadline registered
Dec  1 03:35:57 np0005540741 kernel: io scheduler kyber registered
Dec  1 03:35:57 np0005540741 kernel: io scheduler bfq registered
Dec  1 03:35:57 np0005540741 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec  1 03:35:57 np0005540741 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec  1 03:35:57 np0005540741 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec  1 03:35:57 np0005540741 kernel: ACPI: button: Power Button [PWRF]
Dec  1 03:35:57 np0005540741 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec  1 03:35:57 np0005540741 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec  1 03:35:57 np0005540741 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec  1 03:35:57 np0005540741 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec  1 03:35:57 np0005540741 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec  1 03:35:57 np0005540741 kernel: Non-volatile memory driver v1.3
Dec  1 03:35:57 np0005540741 kernel: rdac: device handler registered
Dec  1 03:35:57 np0005540741 kernel: hp_sw: device handler registered
Dec  1 03:35:57 np0005540741 kernel: emc: device handler registered
Dec  1 03:35:57 np0005540741 kernel: alua: device handler registered
Dec  1 03:35:57 np0005540741 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec  1 03:35:57 np0005540741 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec  1 03:35:57 np0005540741 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec  1 03:35:57 np0005540741 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec  1 03:35:57 np0005540741 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec  1 03:35:57 np0005540741 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec  1 03:35:57 np0005540741 kernel: usb usb1: Product: UHCI Host Controller
Dec  1 03:35:57 np0005540741 kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Dec  1 03:35:57 np0005540741 kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec  1 03:35:57 np0005540741 kernel: hub 1-0:1.0: USB hub found
Dec  1 03:35:57 np0005540741 kernel: hub 1-0:1.0: 2 ports detected
Dec  1 03:35:57 np0005540741 kernel: usbcore: registered new interface driver usbserial_generic
Dec  1 03:35:57 np0005540741 kernel: usbserial: USB Serial support registered for generic
Dec  1 03:35:57 np0005540741 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec  1 03:35:57 np0005540741 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec  1 03:35:57 np0005540741 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec  1 03:35:57 np0005540741 kernel: mousedev: PS/2 mouse device common for all mice
Dec  1 03:35:57 np0005540741 kernel: rtc_cmos 00:04: RTC can wake from S4
Dec  1 03:35:57 np0005540741 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec  1 03:35:57 np0005540741 kernel: rtc_cmos 00:04: registered as rtc0
Dec  1 03:35:57 np0005540741 kernel: rtc_cmos 00:04: setting system clock to 2025-12-01T08:35:56 UTC (1764578156)
Dec  1 03:35:57 np0005540741 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec  1 03:35:57 np0005540741 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec  1 03:35:57 np0005540741 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec  1 03:35:57 np0005540741 kernel: hid: raw HID events driver (C) Jiri Kosina
Dec  1 03:35:57 np0005540741 kernel: usbcore: registered new interface driver usbhid
Dec  1 03:35:57 np0005540741 kernel: usbhid: USB HID core driver
Dec  1 03:35:57 np0005540741 kernel: drop_monitor: Initializing network drop monitor service
Dec  1 03:35:57 np0005540741 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec  1 03:35:57 np0005540741 kernel: Initializing XFRM netlink socket
Dec  1 03:35:57 np0005540741 kernel: NET: Registered PF_INET6 protocol family
Dec  1 03:35:57 np0005540741 kernel: Segment Routing with IPv6
Dec  1 03:35:57 np0005540741 kernel: NET: Registered PF_PACKET protocol family
Dec  1 03:35:57 np0005540741 kernel: mpls_gso: MPLS GSO support
Dec  1 03:35:57 np0005540741 kernel: IPI shorthand broadcast: enabled
Dec  1 03:35:57 np0005540741 kernel: AVX2 version of gcm_enc/dec engaged.
Dec  1 03:35:57 np0005540741 kernel: AES CTR mode by8 optimization enabled
Dec  1 03:35:57 np0005540741 kernel: sched_clock: Marking stable (1513001650, 140242759)->(1780771269, -127526860)
Dec  1 03:35:57 np0005540741 kernel: registered taskstats version 1
Dec  1 03:35:57 np0005540741 kernel: Loading compiled-in X.509 certificates
Dec  1 03:35:57 np0005540741 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Dec  1 03:35:57 np0005540741 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec  1 03:35:57 np0005540741 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec  1 03:35:57 np0005540741 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec  1 03:35:57 np0005540741 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec  1 03:35:57 np0005540741 kernel: Demotion targets for Node 0: null
Dec  1 03:35:57 np0005540741 kernel: page_owner is disabled
Dec  1 03:35:57 np0005540741 kernel: Key type .fscrypt registered
Dec  1 03:35:57 np0005540741 kernel: Key type fscrypt-provisioning registered
Dec  1 03:35:57 np0005540741 kernel: Key type big_key registered
Dec  1 03:35:57 np0005540741 kernel: Key type encrypted registered
Dec  1 03:35:57 np0005540741 kernel: ima: No TPM chip found, activating TPM-bypass!
Dec  1 03:35:57 np0005540741 kernel: Loading compiled-in module X.509 certificates
Dec  1 03:35:57 np0005540741 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Dec  1 03:35:57 np0005540741 kernel: ima: Allocated hash algorithm: sha256
Dec  1 03:35:57 np0005540741 kernel: ima: No architecture policies found
Dec  1 03:35:57 np0005540741 kernel: evm: Initialising EVM extended attributes:
Dec  1 03:35:57 np0005540741 kernel: evm: security.selinux
Dec  1 03:35:57 np0005540741 kernel: evm: security.SMACK64 (disabled)
Dec  1 03:35:57 np0005540741 kernel: evm: security.SMACK64EXEC (disabled)
Dec  1 03:35:57 np0005540741 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec  1 03:35:57 np0005540741 kernel: evm: security.SMACK64MMAP (disabled)
Dec  1 03:35:57 np0005540741 kernel: evm: security.apparmor (disabled)
Dec  1 03:35:57 np0005540741 kernel: evm: security.ima
Dec  1 03:35:57 np0005540741 kernel: evm: security.capability
Dec  1 03:35:57 np0005540741 kernel: evm: HMAC attrs: 0x1
Dec  1 03:35:57 np0005540741 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec  1 03:35:57 np0005540741 kernel: Running certificate verification RSA selftest
Dec  1 03:35:57 np0005540741 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec  1 03:35:57 np0005540741 kernel: Running certificate verification ECDSA selftest
Dec  1 03:35:57 np0005540741 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec  1 03:35:57 np0005540741 kernel: clk: Disabling unused clocks
Dec  1 03:35:57 np0005540741 kernel: Freeing unused decrypted memory: 2028K
Dec  1 03:35:57 np0005540741 kernel: Freeing unused kernel image (initmem) memory: 4192K
Dec  1 03:35:57 np0005540741 kernel: Write protecting the kernel read-only data: 30720k
Dec  1 03:35:57 np0005540741 kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Dec  1 03:35:57 np0005540741 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec  1 03:35:57 np0005540741 kernel: Run /init as init process
Dec  1 03:35:57 np0005540741 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec  1 03:35:57 np0005540741 systemd: Detected virtualization kvm.
Dec  1 03:35:57 np0005540741 systemd: Detected architecture x86-64.
Dec  1 03:35:57 np0005540741 systemd: Running in initrd.
Dec  1 03:35:57 np0005540741 systemd: No hostname configured, using default hostname.
Dec  1 03:35:57 np0005540741 systemd: Hostname set to <localhost>.
Dec  1 03:35:57 np0005540741 systemd: Initializing machine ID from VM UUID.
Dec  1 03:35:57 np0005540741 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec  1 03:35:57 np0005540741 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec  1 03:35:57 np0005540741 kernel: usb 1-1: Product: QEMU USB Tablet
Dec  1 03:35:57 np0005540741 kernel: usb 1-1: Manufacturer: QEMU
Dec  1 03:35:57 np0005540741 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec  1 03:35:57 np0005540741 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec  1 03:35:57 np0005540741 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec  1 03:35:57 np0005540741 systemd: Queued start job for default target Initrd Default Target.
Dec  1 03:35:57 np0005540741 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec  1 03:35:57 np0005540741 systemd: Reached target Local Encrypted Volumes.
Dec  1 03:35:57 np0005540741 systemd: Reached target Initrd /usr File System.
Dec  1 03:35:57 np0005540741 systemd: Reached target Local File Systems.
Dec  1 03:35:57 np0005540741 systemd: Reached target Path Units.
Dec  1 03:35:57 np0005540741 systemd: Reached target Slice Units.
Dec  1 03:35:57 np0005540741 systemd: Reached target Swaps.
Dec  1 03:35:57 np0005540741 systemd: Reached target Timer Units.
Dec  1 03:35:57 np0005540741 systemd: Listening on D-Bus System Message Bus Socket.
Dec  1 03:35:57 np0005540741 systemd: Listening on Journal Socket (/dev/log).
Dec  1 03:35:57 np0005540741 systemd: Listening on Journal Socket.
Dec  1 03:35:57 np0005540741 systemd: Listening on udev Control Socket.
Dec  1 03:35:57 np0005540741 systemd: Listening on udev Kernel Socket.
Dec  1 03:35:57 np0005540741 systemd: Reached target Socket Units.
Dec  1 03:35:57 np0005540741 systemd: Starting Create List of Static Device Nodes...
Dec  1 03:35:57 np0005540741 systemd: Starting Journal Service...
Dec  1 03:35:57 np0005540741 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec  1 03:35:57 np0005540741 systemd: Starting Apply Kernel Variables...
Dec  1 03:35:57 np0005540741 systemd: Starting Create System Users...
Dec  1 03:35:57 np0005540741 systemd: Starting Setup Virtual Console...
Dec  1 03:35:57 np0005540741 systemd: Finished Create List of Static Device Nodes.
Dec  1 03:35:57 np0005540741 systemd: Finished Apply Kernel Variables.
Dec  1 03:35:57 np0005540741 systemd: Finished Create System Users.
Dec  1 03:35:57 np0005540741 systemd-journald[305]: Journal started
Dec  1 03:35:57 np0005540741 systemd-journald[305]: Runtime Journal (/run/log/journal/523109271d304bda9d2bfd9f7cfadc4d) is 8.0M, max 153.6M, 145.6M free.
Dec  1 03:35:57 np0005540741 systemd-sysusers[310]: Creating group 'users' with GID 100.
Dec  1 03:35:57 np0005540741 systemd-sysusers[310]: Creating group 'dbus' with GID 81.
Dec  1 03:35:57 np0005540741 systemd-sysusers[310]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec  1 03:35:57 np0005540741 systemd: Started Journal Service.
Dec  1 03:35:57 np0005540741 systemd[1]: Starting Create Static Device Nodes in /dev...
Dec  1 03:35:57 np0005540741 systemd[1]: Starting Create Volatile Files and Directories...
Dec  1 03:35:57 np0005540741 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec  1 03:35:57 np0005540741 systemd[1]: Finished Create Volatile Files and Directories.
Dec  1 03:35:57 np0005540741 systemd[1]: Finished Setup Virtual Console.
Dec  1 03:35:57 np0005540741 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec  1 03:35:57 np0005540741 systemd[1]: Starting dracut cmdline hook...
Dec  1 03:35:57 np0005540741 dracut-cmdline[325]: dracut-9 dracut-057-102.git20250818.el9
Dec  1 03:35:57 np0005540741 dracut-cmdline[325]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  1 03:35:57 np0005540741 systemd[1]: Finished dracut cmdline hook.
Dec  1 03:35:57 np0005540741 systemd[1]: Starting dracut pre-udev hook...
Dec  1 03:35:57 np0005540741 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec  1 03:35:57 np0005540741 kernel: device-mapper: uevent: version 1.0.3
Dec  1 03:35:57 np0005540741 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec  1 03:35:57 np0005540741 kernel: RPC: Registered named UNIX socket transport module.
Dec  1 03:35:57 np0005540741 kernel: RPC: Registered udp transport module.
Dec  1 03:35:57 np0005540741 kernel: RPC: Registered tcp transport module.
Dec  1 03:35:57 np0005540741 kernel: RPC: Registered tcp-with-tls transport module.
Dec  1 03:35:57 np0005540741 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec  1 03:35:57 np0005540741 rpc.statd[441]: Version 2.5.4 starting
Dec  1 03:35:57 np0005540741 rpc.statd[441]: Initializing NSM state
Dec  1 03:35:57 np0005540741 rpc.idmapd[446]: Setting log level to 0
Dec  1 03:35:57 np0005540741 systemd[1]: Finished dracut pre-udev hook.
Dec  1 03:35:57 np0005540741 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec  1 03:35:57 np0005540741 systemd-udevd[459]: Using default interface naming scheme 'rhel-9.0'.
Dec  1 03:35:57 np0005540741 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec  1 03:35:57 np0005540741 systemd[1]: Starting dracut pre-trigger hook...
Dec  1 03:35:57 np0005540741 systemd[1]: Finished dracut pre-trigger hook.
Dec  1 03:35:57 np0005540741 systemd[1]: Starting Coldplug All udev Devices...
Dec  1 03:35:57 np0005540741 systemd[1]: Created slice Slice /system/modprobe.
Dec  1 03:35:58 np0005540741 systemd[1]: Starting Load Kernel Module configfs...
Dec  1 03:35:58 np0005540741 systemd[1]: Finished Coldplug All udev Devices.
Dec  1 03:35:58 np0005540741 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  1 03:35:58 np0005540741 systemd[1]: Finished Load Kernel Module configfs.
Dec  1 03:35:58 np0005540741 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec  1 03:35:58 np0005540741 systemd[1]: Reached target Network.
Dec  1 03:35:58 np0005540741 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec  1 03:35:58 np0005540741 systemd[1]: Starting dracut initqueue hook...
Dec  1 03:35:58 np0005540741 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec  1 03:35:58 np0005540741 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec  1 03:35:58 np0005540741 kernel: vda: vda1
Dec  1 03:35:58 np0005540741 kernel: scsi host0: ata_piix
Dec  1 03:35:58 np0005540741 kernel: scsi host1: ata_piix
Dec  1 03:35:58 np0005540741 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec  1 03:35:58 np0005540741 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec  1 03:35:58 np0005540741 systemd[1]: Mounting Kernel Configuration File System...
Dec  1 03:35:58 np0005540741 systemd[1]: Mounted Kernel Configuration File System.
Dec  1 03:35:58 np0005540741 systemd[1]: Found device /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Dec  1 03:35:58 np0005540741 systemd[1]: Reached target Initrd Root Device.
Dec  1 03:35:58 np0005540741 systemd[1]: Reached target System Initialization.
Dec  1 03:35:58 np0005540741 systemd[1]: Reached target Basic System.
Dec  1 03:35:58 np0005540741 kernel: ata1: found unknown device (class 0)
Dec  1 03:35:58 np0005540741 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec  1 03:35:58 np0005540741 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec  1 03:35:58 np0005540741 systemd-udevd[478]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 03:35:58 np0005540741 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec  1 03:35:58 np0005540741 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec  1 03:35:58 np0005540741 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec  1 03:35:58 np0005540741 systemd[1]: Finished dracut initqueue hook.
Dec  1 03:35:58 np0005540741 systemd[1]: Reached target Preparation for Remote File Systems.
Dec  1 03:35:58 np0005540741 systemd[1]: Reached target Remote Encrypted Volumes.
Dec  1 03:35:58 np0005540741 systemd[1]: Reached target Remote File Systems.
Dec  1 03:35:58 np0005540741 systemd[1]: Starting dracut pre-mount hook...
Dec  1 03:35:58 np0005540741 systemd[1]: Finished dracut pre-mount hook.
Dec  1 03:35:58 np0005540741 systemd[1]: Starting File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253...
Dec  1 03:35:58 np0005540741 systemd-fsck[554]: /usr/sbin/fsck.xfs: XFS file system.
Dec  1 03:35:58 np0005540741 systemd[1]: Finished File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Dec  1 03:35:58 np0005540741 systemd[1]: Mounting /sysroot...
Dec  1 03:35:58 np0005540741 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec  1 03:35:58 np0005540741 kernel: XFS (vda1): Mounting V5 Filesystem b277050f-8ace-464d-abb6-4c46d4c45253
Dec  1 03:35:59 np0005540741 kernel: XFS (vda1): Ending clean mount
Dec  1 03:35:59 np0005540741 systemd[1]: Mounted /sysroot.
Dec  1 03:35:59 np0005540741 systemd[1]: Reached target Initrd Root File System.
Dec  1 03:35:59 np0005540741 systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec  1 03:35:59 np0005540741 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec  1 03:35:59 np0005540741 systemd[1]: Reached target Initrd File Systems.
Dec  1 03:35:59 np0005540741 systemd[1]: Reached target Initrd Default Target.
Dec  1 03:35:59 np0005540741 systemd[1]: Starting dracut mount hook...
Dec  1 03:35:59 np0005540741 systemd[1]: Finished dracut mount hook.
Dec  1 03:35:59 np0005540741 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec  1 03:35:59 np0005540741 rpc.idmapd[446]: exiting on signal 15
Dec  1 03:35:59 np0005540741 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec  1 03:35:59 np0005540741 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped target Network.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped target Remote Encrypted Volumes.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped target Timer Units.
Dec  1 03:35:59 np0005540741 systemd[1]: dbus.socket: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Closed D-Bus System Message Bus Socket.
Dec  1 03:35:59 np0005540741 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped target Initrd Default Target.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped target Basic System.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped target Initrd Root Device.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped target Initrd /usr File System.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped target Path Units.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped target Remote File Systems.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped target Preparation for Remote File Systems.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped target Slice Units.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped target Socket Units.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped target System Initialization.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped target Local File Systems.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped target Swaps.
Dec  1 03:35:59 np0005540741 systemd[1]: dracut-mount.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped dracut mount hook.
Dec  1 03:35:59 np0005540741 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped dracut pre-mount hook.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped target Local Encrypted Volumes.
Dec  1 03:35:59 np0005540741 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec  1 03:35:59 np0005540741 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped dracut initqueue hook.
Dec  1 03:35:59 np0005540741 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped Apply Kernel Variables.
Dec  1 03:35:59 np0005540741 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped Create Volatile Files and Directories.
Dec  1 03:35:59 np0005540741 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped Coldplug All udev Devices.
Dec  1 03:35:59 np0005540741 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped dracut pre-trigger hook.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec  1 03:35:59 np0005540741 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped Setup Virtual Console.
Dec  1 03:35:59 np0005540741 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec  1 03:35:59 np0005540741 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec  1 03:35:59 np0005540741 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Closed udev Control Socket.
Dec  1 03:35:59 np0005540741 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Closed udev Kernel Socket.
Dec  1 03:35:59 np0005540741 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped dracut pre-udev hook.
Dec  1 03:35:59 np0005540741 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped dracut cmdline hook.
Dec  1 03:35:59 np0005540741 systemd[1]: Starting Cleanup udev Database...
Dec  1 03:35:59 np0005540741 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec  1 03:35:59 np0005540741 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped Create List of Static Device Nodes.
Dec  1 03:35:59 np0005540741 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Stopped Create System Users.
Dec  1 03:35:59 np0005540741 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec  1 03:35:59 np0005540741 systemd[1]: Finished Cleanup udev Database.
Dec  1 03:35:59 np0005540741 systemd[1]: Reached target Switch Root.
Dec  1 03:35:59 np0005540741 systemd[1]: Starting Switch Root...
Dec  1 03:35:59 np0005540741 systemd[1]: Switching root.
Dec  1 03:35:59 np0005540741 systemd-journald[305]: Journal stopped
Dec  1 03:36:00 np0005540741 systemd-journald: Received SIGTERM from PID 1 (systemd).
Dec  1 03:36:00 np0005540741 kernel: audit: type=1404 audit(1764578159.503:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec  1 03:36:00 np0005540741 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 03:36:00 np0005540741 kernel: SELinux:  policy capability open_perms=1
Dec  1 03:36:00 np0005540741 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 03:36:00 np0005540741 kernel: SELinux:  policy capability always_check_network=0
Dec  1 03:36:00 np0005540741 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 03:36:00 np0005540741 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 03:36:00 np0005540741 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 03:36:00 np0005540741 kernel: audit: type=1403 audit(1764578159.665:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec  1 03:36:00 np0005540741 systemd: Successfully loaded SELinux policy in 165.213ms.
Dec  1 03:36:00 np0005540741 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.076ms.
Dec  1 03:36:00 np0005540741 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec  1 03:36:00 np0005540741 systemd: Detected virtualization kvm.
Dec  1 03:36:00 np0005540741 systemd: Detected architecture x86-64.
Dec  1 03:36:00 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 03:36:00 np0005540741 systemd: initrd-switch-root.service: Deactivated successfully.
Dec  1 03:36:00 np0005540741 systemd: Stopped Switch Root.
Dec  1 03:36:00 np0005540741 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec  1 03:36:00 np0005540741 systemd: Created slice Slice /system/getty.
Dec  1 03:36:00 np0005540741 systemd: Created slice Slice /system/serial-getty.
Dec  1 03:36:00 np0005540741 systemd: Created slice Slice /system/sshd-keygen.
Dec  1 03:36:00 np0005540741 systemd: Created slice User and Session Slice.
Dec  1 03:36:00 np0005540741 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec  1 03:36:00 np0005540741 systemd: Started Forward Password Requests to Wall Directory Watch.
Dec  1 03:36:00 np0005540741 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec  1 03:36:00 np0005540741 systemd: Reached target Local Encrypted Volumes.
Dec  1 03:36:00 np0005540741 systemd: Stopped target Switch Root.
Dec  1 03:36:00 np0005540741 systemd: Stopped target Initrd File Systems.
Dec  1 03:36:00 np0005540741 systemd: Stopped target Initrd Root File System.
Dec  1 03:36:00 np0005540741 systemd: Reached target Local Integrity Protected Volumes.
Dec  1 03:36:00 np0005540741 systemd: Reached target Path Units.
Dec  1 03:36:00 np0005540741 systemd: Reached target rpc_pipefs.target.
Dec  1 03:36:00 np0005540741 systemd: Reached target Slice Units.
Dec  1 03:36:00 np0005540741 systemd: Reached target Swaps.
Dec  1 03:36:00 np0005540741 systemd: Reached target Local Verity Protected Volumes.
Dec  1 03:36:00 np0005540741 systemd: Listening on RPCbind Server Activation Socket.
Dec  1 03:36:00 np0005540741 systemd: Reached target RPC Port Mapper.
Dec  1 03:36:00 np0005540741 systemd: Listening on Process Core Dump Socket.
Dec  1 03:36:00 np0005540741 systemd: Listening on initctl Compatibility Named Pipe.
Dec  1 03:36:00 np0005540741 systemd: Listening on udev Control Socket.
Dec  1 03:36:00 np0005540741 systemd: Listening on udev Kernel Socket.
Dec  1 03:36:00 np0005540741 systemd: Mounting Huge Pages File System...
Dec  1 03:36:00 np0005540741 systemd: Mounting POSIX Message Queue File System...
Dec  1 03:36:00 np0005540741 systemd: Mounting Kernel Debug File System...
Dec  1 03:36:00 np0005540741 systemd: Mounting Kernel Trace File System...
Dec  1 03:36:00 np0005540741 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec  1 03:36:00 np0005540741 systemd: Starting Create List of Static Device Nodes...
Dec  1 03:36:00 np0005540741 systemd: Starting Load Kernel Module configfs...
Dec  1 03:36:00 np0005540741 systemd: Starting Load Kernel Module drm...
Dec  1 03:36:00 np0005540741 systemd: Starting Load Kernel Module efi_pstore...
Dec  1 03:36:00 np0005540741 systemd: Starting Load Kernel Module fuse...
Dec  1 03:36:00 np0005540741 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec  1 03:36:00 np0005540741 systemd: systemd-fsck-root.service: Deactivated successfully.
Dec  1 03:36:00 np0005540741 systemd: Stopped File System Check on Root Device.
Dec  1 03:36:00 np0005540741 systemd: Stopped Journal Service.
Dec  1 03:36:00 np0005540741 kernel: fuse: init (API version 7.37)
Dec  1 03:36:00 np0005540741 systemd: Starting Journal Service...
Dec  1 03:36:00 np0005540741 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec  1 03:36:00 np0005540741 systemd: Starting Generate network units from Kernel command line...
Dec  1 03:36:00 np0005540741 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  1 03:36:00 np0005540741 systemd: Starting Remount Root and Kernel File Systems...
Dec  1 03:36:00 np0005540741 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec  1 03:36:00 np0005540741 systemd: Starting Apply Kernel Variables...
Dec  1 03:36:00 np0005540741 systemd: Starting Coldplug All udev Devices...
Dec  1 03:36:00 np0005540741 systemd-journald[677]: Journal started
Dec  1 03:36:00 np0005540741 systemd-journald[677]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Dec  1 03:36:00 np0005540741 systemd[1]: Queued start job for default target Multi-User System.
Dec  1 03:36:00 np0005540741 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec  1 03:36:00 np0005540741 systemd: Started Journal Service.
Dec  1 03:36:00 np0005540741 systemd[1]: Mounted Huge Pages File System.
Dec  1 03:36:00 np0005540741 systemd[1]: Mounted POSIX Message Queue File System.
Dec  1 03:36:00 np0005540741 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec  1 03:36:00 np0005540741 kernel: ACPI: bus type drm_connector registered
Dec  1 03:36:00 np0005540741 systemd[1]: Mounted Kernel Debug File System.
Dec  1 03:36:00 np0005540741 systemd[1]: Mounted Kernel Trace File System.
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Create List of Static Device Nodes.
Dec  1 03:36:00 np0005540741 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Load Kernel Module configfs.
Dec  1 03:36:00 np0005540741 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Load Kernel Module drm.
Dec  1 03:36:00 np0005540741 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Load Kernel Module efi_pstore.
Dec  1 03:36:00 np0005540741 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Load Kernel Module fuse.
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Generate network units from Kernel command line.
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Remount Root and Kernel File Systems.
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Apply Kernel Variables.
Dec  1 03:36:00 np0005540741 systemd[1]: Mounting FUSE Control File System...
Dec  1 03:36:00 np0005540741 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec  1 03:36:00 np0005540741 systemd[1]: Starting Rebuild Hardware Database...
Dec  1 03:36:00 np0005540741 systemd[1]: Starting Flush Journal to Persistent Storage...
Dec  1 03:36:00 np0005540741 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec  1 03:36:00 np0005540741 systemd[1]: Starting Load/Save OS Random Seed...
Dec  1 03:36:00 np0005540741 systemd[1]: Starting Create System Users...
Dec  1 03:36:00 np0005540741 systemd-journald[677]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Dec  1 03:36:00 np0005540741 systemd-journald[677]: Received client request to flush runtime journal.
Dec  1 03:36:00 np0005540741 systemd[1]: Mounted FUSE Control File System.
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Flush Journal to Persistent Storage.
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Load/Save OS Random Seed.
Dec  1 03:36:00 np0005540741 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Create System Users.
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Coldplug All udev Devices.
Dec  1 03:36:00 np0005540741 systemd[1]: Starting Create Static Device Nodes in /dev...
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec  1 03:36:00 np0005540741 systemd[1]: Reached target Preparation for Local File Systems.
Dec  1 03:36:00 np0005540741 systemd[1]: Reached target Local File Systems.
Dec  1 03:36:00 np0005540741 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec  1 03:36:00 np0005540741 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec  1 03:36:00 np0005540741 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec  1 03:36:00 np0005540741 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec  1 03:36:00 np0005540741 systemd[1]: Starting Automatic Boot Loader Update...
Dec  1 03:36:00 np0005540741 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec  1 03:36:00 np0005540741 systemd[1]: Starting Create Volatile Files and Directories...
Dec  1 03:36:00 np0005540741 bootctl[696]: Couldn't find EFI system partition, skipping.
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Automatic Boot Loader Update.
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Create Volatile Files and Directories.
Dec  1 03:36:00 np0005540741 systemd[1]: Starting Security Auditing Service...
Dec  1 03:36:00 np0005540741 systemd[1]: Starting RPC Bind...
Dec  1 03:36:00 np0005540741 systemd[1]: Starting Rebuild Journal Catalog...
Dec  1 03:36:00 np0005540741 auditd[702]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec  1 03:36:00 np0005540741 auditd[702]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec  1 03:36:00 np0005540741 systemd[1]: Started RPC Bind.
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Rebuild Journal Catalog.
Dec  1 03:36:00 np0005540741 augenrules[707]: /sbin/augenrules: No change
Dec  1 03:36:00 np0005540741 augenrules[723]: No rules
Dec  1 03:36:00 np0005540741 augenrules[723]: enabled 1
Dec  1 03:36:00 np0005540741 augenrules[723]: failure 1
Dec  1 03:36:00 np0005540741 augenrules[723]: pid 702
Dec  1 03:36:00 np0005540741 augenrules[723]: rate_limit 0
Dec  1 03:36:00 np0005540741 augenrules[723]: backlog_limit 8192
Dec  1 03:36:00 np0005540741 augenrules[723]: lost 0
Dec  1 03:36:00 np0005540741 augenrules[723]: backlog 2
Dec  1 03:36:00 np0005540741 augenrules[723]: backlog_wait_time 60000
Dec  1 03:36:00 np0005540741 augenrules[723]: backlog_wait_time_actual 0
Dec  1 03:36:00 np0005540741 augenrules[723]: enabled 1
Dec  1 03:36:00 np0005540741 augenrules[723]: failure 1
Dec  1 03:36:00 np0005540741 augenrules[723]: pid 702
Dec  1 03:36:00 np0005540741 augenrules[723]: rate_limit 0
Dec  1 03:36:00 np0005540741 augenrules[723]: backlog_limit 8192
Dec  1 03:36:00 np0005540741 augenrules[723]: lost 0
Dec  1 03:36:00 np0005540741 augenrules[723]: backlog 2
Dec  1 03:36:00 np0005540741 augenrules[723]: backlog_wait_time 60000
Dec  1 03:36:00 np0005540741 augenrules[723]: backlog_wait_time_actual 0
Dec  1 03:36:00 np0005540741 augenrules[723]: enabled 1
Dec  1 03:36:00 np0005540741 augenrules[723]: failure 1
Dec  1 03:36:00 np0005540741 augenrules[723]: pid 702
Dec  1 03:36:00 np0005540741 augenrules[723]: rate_limit 0
Dec  1 03:36:00 np0005540741 augenrules[723]: backlog_limit 8192
Dec  1 03:36:00 np0005540741 augenrules[723]: lost 0
Dec  1 03:36:00 np0005540741 augenrules[723]: backlog 0
Dec  1 03:36:00 np0005540741 augenrules[723]: backlog_wait_time 60000
Dec  1 03:36:00 np0005540741 augenrules[723]: backlog_wait_time_actual 0
Dec  1 03:36:00 np0005540741 systemd[1]: Started Security Auditing Service.
Dec  1 03:36:00 np0005540741 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec  1 03:36:00 np0005540741 systemd[1]: Finished Rebuild Hardware Database.
Dec  1 03:36:00 np0005540741 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec  1 03:36:01 np0005540741 systemd[1]: Starting Update is Completed...
Dec  1 03:36:01 np0005540741 systemd[1]: Finished Update is Completed.
Dec  1 03:36:01 np0005540741 systemd-udevd[731]: Using default interface naming scheme 'rhel-9.0'.
Dec  1 03:36:01 np0005540741 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec  1 03:36:01 np0005540741 systemd[1]: Reached target System Initialization.
Dec  1 03:36:01 np0005540741 systemd[1]: Started dnf makecache --timer.
Dec  1 03:36:01 np0005540741 systemd[1]: Started Daily rotation of log files.
Dec  1 03:36:01 np0005540741 systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec  1 03:36:01 np0005540741 systemd[1]: Reached target Timer Units.
Dec  1 03:36:01 np0005540741 systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec  1 03:36:01 np0005540741 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec  1 03:36:01 np0005540741 systemd[1]: Reached target Socket Units.
Dec  1 03:36:01 np0005540741 systemd[1]: Starting D-Bus System Message Bus...
Dec  1 03:36:01 np0005540741 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  1 03:36:01 np0005540741 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec  1 03:36:01 np0005540741 systemd[1]: Starting Load Kernel Module configfs...
Dec  1 03:36:01 np0005540741 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  1 03:36:01 np0005540741 systemd[1]: Finished Load Kernel Module configfs.
Dec  1 03:36:01 np0005540741 systemd-udevd[736]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 03:36:01 np0005540741 systemd[1]: Started D-Bus System Message Bus.
Dec  1 03:36:01 np0005540741 systemd[1]: Reached target Basic System.
Dec  1 03:36:01 np0005540741 dbus-broker-lau[757]: Ready
Dec  1 03:36:01 np0005540741 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec  1 03:36:01 np0005540741 systemd[1]: Starting NTP client/server...
Dec  1 03:36:01 np0005540741 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec  1 03:36:01 np0005540741 systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec  1 03:36:01 np0005540741 systemd[1]: Starting IPv4 firewall with iptables...
Dec  1 03:36:01 np0005540741 systemd[1]: Started irqbalance daemon.
Dec  1 03:36:01 np0005540741 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec  1 03:36:01 np0005540741 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 03:36:01 np0005540741 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 03:36:01 np0005540741 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 03:36:01 np0005540741 systemd[1]: Reached target sshd-keygen.target.
Dec  1 03:36:01 np0005540741 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec  1 03:36:01 np0005540741 systemd[1]: Reached target User and Group Name Lookups.
Dec  1 03:36:01 np0005540741 systemd[1]: Starting User Login Management...
Dec  1 03:36:01 np0005540741 systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec  1 03:36:01 np0005540741 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec  1 03:36:01 np0005540741 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec  1 03:36:01 np0005540741 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec  1 03:36:01 np0005540741 chronyd[791]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec  1 03:36:01 np0005540741 chronyd[791]: Loaded 0 symmetric keys
Dec  1 03:36:01 np0005540741 chronyd[791]: Using right/UTC timezone to obtain leap second data
Dec  1 03:36:01 np0005540741 chronyd[791]: Loaded seccomp filter (level 2)
Dec  1 03:36:01 np0005540741 systemd[1]: Started NTP client/server.
Dec  1 03:36:01 np0005540741 systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Dec  1 03:36:01 np0005540741 systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec  1 03:36:01 np0005540741 systemd-logind[788]: New seat seat0.
Dec  1 03:36:01 np0005540741 systemd[1]: Started User Login Management.
Dec  1 03:36:01 np0005540741 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec  1 03:36:01 np0005540741 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec  1 03:36:01 np0005540741 kernel: Console: switching to colour dummy device 80x25
Dec  1 03:36:01 np0005540741 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec  1 03:36:01 np0005540741 kernel: [drm] features: -context_init
Dec  1 03:36:01 np0005540741 kernel: [drm] number of scanouts: 1
Dec  1 03:36:01 np0005540741 kernel: [drm] number of cap sets: 0
Dec  1 03:36:01 np0005540741 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec  1 03:36:01 np0005540741 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec  1 03:36:01 np0005540741 kernel: Console: switching to colour frame buffer device 128x48
Dec  1 03:36:01 np0005540741 kernel: kvm_amd: TSC scaling supported
Dec  1 03:36:01 np0005540741 kernel: kvm_amd: Nested Virtualization enabled
Dec  1 03:36:01 np0005540741 kernel: kvm_amd: Nested Paging enabled
Dec  1 03:36:01 np0005540741 kernel: kvm_amd: LBR virtualization supported
Dec  1 03:36:01 np0005540741 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec  1 03:36:01 np0005540741 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec  1 03:36:01 np0005540741 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec  1 03:36:01 np0005540741 iptables.init[781]: iptables: Applying firewall rules: [  OK  ]
Dec  1 03:36:01 np0005540741 systemd[1]: Finished IPv4 firewall with iptables.
Dec  1 03:36:01 np0005540741 cloud-init[839]: Cloud-init v. 24.4-7.el9 running 'init-local' at Mon, 01 Dec 2025 08:36:01 +0000. Up 6.64 seconds.
Dec  1 03:36:01 np0005540741 systemd[1]: run-cloud\x2dinit-tmp-tmp5905b2m4.mount: Deactivated successfully.
Dec  1 03:36:01 np0005540741 systemd[1]: Starting Hostname Service...
Dec  1 03:36:02 np0005540741 systemd[1]: Started Hostname Service.
Dec  1 03:36:02 np0005540741 systemd-hostnamed[853]: Hostname set to <np0005540741.novalocal> (static)
Dec  1 03:36:02 np0005540741 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec  1 03:36:02 np0005540741 systemd[1]: Reached target Preparation for Network.
Dec  1 03:36:02 np0005540741 systemd[1]: Starting Network Manager...
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.1846] NetworkManager (version 1.54.1-1.el9) is starting... (boot:fbf967f0-219c-4ceb-b589-3e4f3756d2b4)
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.1850] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2031] manager[0x562f6a532080]: monitoring kernel firmware directory '/lib/firmware'.
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2077] hostname: hostname: using hostnamed
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2077] hostname: static hostname changed from (none) to "np0005540741.novalocal"
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2080] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2170] manager[0x562f6a532080]: rfkill: Wi-Fi hardware radio set enabled
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2170] manager[0x562f6a532080]: rfkill: WWAN hardware radio set enabled
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2204] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2204] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2205] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2206] manager: Networking is enabled by state file
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2208] settings: Loaded settings plugin: keyfile (internal)
Dec  1 03:36:02 np0005540741 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2222] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2243] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2253] dhcp: init: Using DHCP client 'internal'
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2255] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2266] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2273] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2279] device (lo): Activation: starting connection 'lo' (d32b959f-25b9-49e2-b1c9-8c743b9b7f56)
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2286] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2288] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2313] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2316] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2318] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2319] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2321] device (eth0): carrier: link connected
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2324] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2329] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2337] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2340] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2341] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2343] manager: NetworkManager state is now CONNECTING
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2344] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2349] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2351] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 03:36:02 np0005540741 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 03:36:02 np0005540741 systemd[1]: Started Network Manager.
Dec  1 03:36:02 np0005540741 systemd[1]: Reached target Network.
Dec  1 03:36:02 np0005540741 systemd[1]: Starting Network Manager Wait Online...
Dec  1 03:36:02 np0005540741 systemd[1]: Starting GSSAPI Proxy Daemon...
Dec  1 03:36:02 np0005540741 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2583] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2604] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  1 03:36:02 np0005540741 NetworkManager[858]: <info>  [1764578162.2609] device (lo): Activation: successful, device activated.
Dec  1 03:36:02 np0005540741 systemd[1]: Started GSSAPI Proxy Daemon.
Dec  1 03:36:02 np0005540741 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec  1 03:36:02 np0005540741 systemd[1]: Reached target NFS client services.
Dec  1 03:36:02 np0005540741 systemd[1]: Reached target Preparation for Remote File Systems.
Dec  1 03:36:02 np0005540741 systemd[1]: Reached target Remote File Systems.
Dec  1 03:36:02 np0005540741 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  1 03:36:03 np0005540741 NetworkManager[858]: <info>  [1764578163.0247] dhcp4 (eth0): state changed new lease, address=38.102.83.132
Dec  1 03:36:03 np0005540741 NetworkManager[858]: <info>  [1764578163.0260] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  1 03:36:03 np0005540741 NetworkManager[858]: <info>  [1764578163.0282] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 03:36:03 np0005540741 NetworkManager[858]: <info>  [1764578163.0324] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 03:36:03 np0005540741 NetworkManager[858]: <info>  [1764578163.0326] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 03:36:03 np0005540741 NetworkManager[858]: <info>  [1764578163.0330] manager: NetworkManager state is now CONNECTED_SITE
Dec  1 03:36:03 np0005540741 NetworkManager[858]: <info>  [1764578163.0334] device (eth0): Activation: successful, device activated.
Dec  1 03:36:03 np0005540741 NetworkManager[858]: <info>  [1764578163.0338] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  1 03:36:03 np0005540741 NetworkManager[858]: <info>  [1764578163.0340] manager: startup complete
Dec  1 03:36:03 np0005540741 systemd[1]: Finished Network Manager Wait Online.
Dec  1 03:36:03 np0005540741 systemd[1]: Starting Cloud-init: Network Stage...
Dec  1 03:36:03 np0005540741 cloud-init[922]: Cloud-init v. 24.4-7.el9 running 'init' at Mon, 01 Dec 2025 08:36:03 +0000. Up 8.32 seconds.
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: |  eth0  | True |        38.102.83.132         | 255.255.255.0 | global | fa:16:3e:f2:32:3e |
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:fef2:323e/64 |       .       |  link  | fa:16:3e:f2:32:3e |
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec  1 03:36:03 np0005540741 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  1 03:36:04 np0005540741 cloud-init[922]: Generating public/private rsa key pair.
Dec  1 03:36:04 np0005540741 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec  1 03:36:04 np0005540741 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec  1 03:36:04 np0005540741 cloud-init[922]: The key fingerprint is:
Dec  1 03:36:04 np0005540741 cloud-init[922]: SHA256:xQhBWXSaL4ozNKJ/ph2WpJ0r09kyGA9f1WiFzpbkruk root@np0005540741.novalocal
Dec  1 03:36:04 np0005540741 cloud-init[922]: The key's randomart image is:
Dec  1 03:36:04 np0005540741 cloud-init[922]: +---[RSA 3072]----+
Dec  1 03:36:04 np0005540741 cloud-init[922]: |     .+=o..      |
Dec  1 03:36:04 np0005540741 cloud-init[922]: |      ..o*.      |
Dec  1 03:36:04 np0005540741 cloud-init[922]: |       =+=o      |
Dec  1 03:36:04 np0005540741 cloud-init[922]: |        Xo.      |
Dec  1 03:36:04 np0005540741 cloud-init[922]: |  . +  =S .      |
Dec  1 03:36:04 np0005540741 cloud-init[922]: | .o* =....       |
Dec  1 03:36:04 np0005540741 cloud-init[922]: |. .BX+.o         |
Dec  1 03:36:04 np0005540741 cloud-init[922]: | .+oX=+          |
Dec  1 03:36:04 np0005540741 cloud-init[922]: |  o*o+E          |
Dec  1 03:36:04 np0005540741 cloud-init[922]: +----[SHA256]-----+
Dec  1 03:36:04 np0005540741 cloud-init[922]: Generating public/private ecdsa key pair.
Dec  1 03:36:04 np0005540741 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec  1 03:36:04 np0005540741 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec  1 03:36:04 np0005540741 cloud-init[922]: The key fingerprint is:
Dec  1 03:36:04 np0005540741 cloud-init[922]: SHA256:XmS/4ctkAJpdn8d7Jc1bC1X6Xenqra9xpEmasUge6zE root@np0005540741.novalocal
Dec  1 03:36:04 np0005540741 cloud-init[922]: The key's randomart image is:
Dec  1 03:36:04 np0005540741 cloud-init[922]: +---[ECDSA 256]---+
Dec  1 03:36:04 np0005540741 cloud-init[922]: |                .|
Dec  1 03:36:04 np0005540741 cloud-init[922]: |               .o|
Dec  1 03:36:04 np0005540741 cloud-init[922]: |        . +   .o.|
Dec  1 03:36:04 np0005540741 cloud-init[922]: |       + = o oo+o|
Dec  1 03:36:04 np0005540741 cloud-init[922]: |      o Soo.=o+oB|
Dec  1 03:36:04 np0005540741 cloud-init[922]: |       .o.+o*+*o=|
Dec  1 03:36:04 np0005540741 cloud-init[922]: |        .E +==.+.|
Dec  1 03:36:04 np0005540741 cloud-init[922]: |        . o+..+. |
Dec  1 03:36:04 np0005540741 cloud-init[922]: |         .  o++o |
Dec  1 03:36:04 np0005540741 cloud-init[922]: +----[SHA256]-----+
Dec  1 03:36:04 np0005540741 cloud-init[922]: Generating public/private ed25519 key pair.
Dec  1 03:36:04 np0005540741 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec  1 03:36:04 np0005540741 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec  1 03:36:04 np0005540741 cloud-init[922]: The key fingerprint is:
Dec  1 03:36:04 np0005540741 cloud-init[922]: SHA256:nJTsG1uou84clkzJUTgP2QTX6XOjIVcfBfng6FGH1V8 root@np0005540741.novalocal
Dec  1 03:36:04 np0005540741 cloud-init[922]: The key's randomart image is:
Dec  1 03:36:04 np0005540741 cloud-init[922]: +--[ED25519 256]--+
Dec  1 03:36:04 np0005540741 cloud-init[922]: |      .*+. .  .*+|
Dec  1 03:36:04 np0005540741 cloud-init[922]: |      =+..o . * E|
Dec  1 03:36:04 np0005540741 cloud-init[922]: |      .++. . = =o|
Dec  1 03:36:04 np0005540741 cloud-init[922]: |     . *oo= = o o|
Dec  1 03:36:04 np0005540741 cloud-init[922]: |      + So.* o   |
Dec  1 03:36:04 np0005540741 cloud-init[922]: |     o o =. .    |
Dec  1 03:36:04 np0005540741 cloud-init[922]: |      * o        |
Dec  1 03:36:04 np0005540741 cloud-init[922]: |     + o         |
Dec  1 03:36:04 np0005540741 cloud-init[922]: |     .*.         |
Dec  1 03:36:04 np0005540741 cloud-init[922]: +----[SHA256]-----+
Dec  1 03:36:04 np0005540741 systemd[1]: Finished Cloud-init: Network Stage.
Dec  1 03:36:04 np0005540741 systemd[1]: Reached target Cloud-config availability.
Dec  1 03:36:04 np0005540741 systemd[1]: Reached target Network is Online.
Dec  1 03:36:04 np0005540741 systemd[1]: Starting Cloud-init: Config Stage...
Dec  1 03:36:04 np0005540741 systemd[1]: Starting Crash recovery kernel arming...
Dec  1 03:36:04 np0005540741 systemd[1]: Starting Notify NFS peers of a restart...
Dec  1 03:36:04 np0005540741 systemd[1]: Starting System Logging Service...
Dec  1 03:36:04 np0005540741 systemd[1]: Starting OpenSSH server daemon...
Dec  1 03:36:04 np0005540741 sm-notify[1006]: Version 2.5.4 starting
Dec  1 03:36:04 np0005540741 systemd[1]: Starting Permit User Sessions...
Dec  1 03:36:04 np0005540741 systemd[1]: Started Notify NFS peers of a restart.
Dec  1 03:36:04 np0005540741 systemd[1]: Finished Permit User Sessions.
Dec  1 03:36:04 np0005540741 systemd[1]: Started OpenSSH server daemon.
Dec  1 03:36:04 np0005540741 systemd[1]: Started Command Scheduler.
Dec  1 03:36:04 np0005540741 systemd[1]: Started Getty on tty1.
Dec  1 03:36:04 np0005540741 systemd[1]: Started Serial Getty on ttyS0.
Dec  1 03:36:04 np0005540741 systemd[1]: Reached target Login Prompts.
Dec  1 03:36:04 np0005540741 rsyslogd[1007]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1007" x-info="https://www.rsyslog.com"] start
Dec  1 03:36:04 np0005540741 rsyslogd[1007]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec  1 03:36:04 np0005540741 systemd[1]: Started System Logging Service.
Dec  1 03:36:04 np0005540741 systemd[1]: Reached target Multi-User System.
Dec  1 03:36:04 np0005540741 systemd[1]: Starting Record Runlevel Change in UTMP...
Dec  1 03:36:04 np0005540741 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec  1 03:36:04 np0005540741 systemd[1]: Finished Record Runlevel Change in UTMP.
Dec  1 03:36:04 np0005540741 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 03:36:04 np0005540741 kdumpctl[1019]: kdump: No kdump initial ramdisk found.
Dec  1 03:36:04 np0005540741 kdumpctl[1019]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Dec  1 03:36:04 np0005540741 cloud-init[1144]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Mon, 01 Dec 2025 08:36:04 +0000. Up 9.87 seconds.
Dec  1 03:36:05 np0005540741 systemd[1]: Finished Cloud-init: Config Stage.
Dec  1 03:36:05 np0005540741 systemd[1]: Starting Cloud-init: Final Stage...
Dec  1 03:36:05 np0005540741 dracut[1267]: dracut-057-102.git20250818.el9
Dec  1 03:36:05 np0005540741 dracut[1269]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Dec  1 03:36:05 np0005540741 cloud-init[1300]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Mon, 01 Dec 2025 08:36:05 +0000. Up 10.27 seconds.
Dec  1 03:36:05 np0005540741 cloud-init[1339]: #############################################################
Dec  1 03:36:05 np0005540741 cloud-init[1343]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec  1 03:36:05 np0005540741 cloud-init[1349]: 256 SHA256:XmS/4ctkAJpdn8d7Jc1bC1X6Xenqra9xpEmasUge6zE root@np0005540741.novalocal (ECDSA)
Dec  1 03:36:05 np0005540741 cloud-init[1356]: 256 SHA256:nJTsG1uou84clkzJUTgP2QTX6XOjIVcfBfng6FGH1V8 root@np0005540741.novalocal (ED25519)
Dec  1 03:36:05 np0005540741 cloud-init[1360]: 3072 SHA256:xQhBWXSaL4ozNKJ/ph2WpJ0r09kyGA9f1WiFzpbkruk root@np0005540741.novalocal (RSA)
Dec  1 03:36:05 np0005540741 cloud-init[1361]: -----END SSH HOST KEY FINGERPRINTS-----
Dec  1 03:36:05 np0005540741 cloud-init[1363]: #############################################################
Dec  1 03:36:05 np0005540741 cloud-init[1300]: Cloud-init v. 24.4-7.el9 finished at Mon, 01 Dec 2025 08:36:05 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.45 seconds
Dec  1 03:36:05 np0005540741 systemd[1]: Finished Cloud-init: Final Stage.
Dec  1 03:36:05 np0005540741 systemd[1]: Reached target Cloud-init target.
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec  1 03:36:05 np0005540741 dracut[1269]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: memstrack is not available
Dec  1 03:36:06 np0005540741 dracut[1269]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec  1 03:36:06 np0005540741 dracut[1269]: memstrack is not available
Dec  1 03:36:06 np0005540741 dracut[1269]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec  1 03:36:06 np0005540741 dracut[1269]: *** Including module: systemd ***
Dec  1 03:36:06 np0005540741 dracut[1269]: *** Including module: fips ***
Dec  1 03:36:07 np0005540741 dracut[1269]: *** Including module: systemd-initrd ***
Dec  1 03:36:07 np0005540741 dracut[1269]: *** Including module: i18n ***
Dec  1 03:36:07 np0005540741 dracut[1269]: *** Including module: drm ***
Dec  1 03:36:07 np0005540741 dracut[1269]: *** Including module: prefixdevname ***
Dec  1 03:36:07 np0005540741 dracut[1269]: *** Including module: kernel-modules ***
Dec  1 03:36:07 np0005540741 kernel: block vda: the capability attribute has been deprecated.
Dec  1 03:36:08 np0005540741 chronyd[791]: Selected source 54.39.23.64 (2.centos.pool.ntp.org)
Dec  1 03:36:08 np0005540741 chronyd[791]: System clock TAI offset set to 37 seconds
Dec  1 03:36:08 np0005540741 dracut[1269]: *** Including module: kernel-modules-extra ***
Dec  1 03:36:08 np0005540741 dracut[1269]: *** Including module: qemu ***
Dec  1 03:36:08 np0005540741 dracut[1269]: *** Including module: fstab-sys ***
Dec  1 03:36:08 np0005540741 dracut[1269]: *** Including module: rootfs-block ***
Dec  1 03:36:08 np0005540741 dracut[1269]: *** Including module: terminfo ***
Dec  1 03:36:08 np0005540741 dracut[1269]: *** Including module: udev-rules ***
Dec  1 03:36:09 np0005540741 dracut[1269]: Skipping udev rule: 91-permissions.rules
Dec  1 03:36:09 np0005540741 dracut[1269]: Skipping udev rule: 80-drivers-modprobe.rules
Dec  1 03:36:09 np0005540741 dracut[1269]: *** Including module: virtiofs ***
Dec  1 03:36:09 np0005540741 dracut[1269]: *** Including module: dracut-systemd ***
Dec  1 03:36:09 np0005540741 dracut[1269]: *** Including module: usrmount ***
Dec  1 03:36:09 np0005540741 dracut[1269]: *** Including module: base ***
Dec  1 03:36:09 np0005540741 dracut[1269]: *** Including module: fs-lib ***
Dec  1 03:36:09 np0005540741 dracut[1269]: *** Including module: kdumpbase ***
Dec  1 03:36:10 np0005540741 dracut[1269]: *** Including module: microcode_ctl-fw_dir_override ***
Dec  1 03:36:10 np0005540741 dracut[1269]:  microcode_ctl module: mangling fw_dir
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: configuration "intel" is ignored
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec  1 03:36:10 np0005540741 dracut[1269]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec  1 03:36:10 np0005540741 dracut[1269]: *** Including module: openssl ***
Dec  1 03:36:10 np0005540741 dracut[1269]: *** Including module: shutdown ***
Dec  1 03:36:10 np0005540741 dracut[1269]: *** Including module: squash ***
Dec  1 03:36:10 np0005540741 dracut[1269]: *** Including modules done ***
Dec  1 03:36:10 np0005540741 dracut[1269]: *** Installing kernel module dependencies ***
Dec  1 03:36:11 np0005540741 irqbalance[783]: Cannot change IRQ 25 affinity: Operation not permitted
Dec  1 03:36:11 np0005540741 irqbalance[783]: IRQ 25 affinity is now unmanaged
Dec  1 03:36:11 np0005540741 irqbalance[783]: Cannot change IRQ 31 affinity: Operation not permitted
Dec  1 03:36:11 np0005540741 irqbalance[783]: IRQ 31 affinity is now unmanaged
Dec  1 03:36:11 np0005540741 irqbalance[783]: Cannot change IRQ 28 affinity: Operation not permitted
Dec  1 03:36:11 np0005540741 irqbalance[783]: IRQ 28 affinity is now unmanaged
Dec  1 03:36:11 np0005540741 irqbalance[783]: Cannot change IRQ 32 affinity: Operation not permitted
Dec  1 03:36:11 np0005540741 irqbalance[783]: IRQ 32 affinity is now unmanaged
Dec  1 03:36:11 np0005540741 irqbalance[783]: Cannot change IRQ 30 affinity: Operation not permitted
Dec  1 03:36:11 np0005540741 irqbalance[783]: IRQ 30 affinity is now unmanaged
Dec  1 03:36:11 np0005540741 irqbalance[783]: Cannot change IRQ 29 affinity: Operation not permitted
Dec  1 03:36:11 np0005540741 irqbalance[783]: IRQ 29 affinity is now unmanaged
Dec  1 03:36:11 np0005540741 dracut[1269]: *** Installing kernel module dependencies done ***
Dec  1 03:36:11 np0005540741 dracut[1269]: *** Resolving executable dependencies ***
Dec  1 03:36:13 np0005540741 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 03:36:13 np0005540741 dracut[1269]: *** Resolving executable dependencies done ***
Dec  1 03:36:13 np0005540741 dracut[1269]: *** Generating early-microcode cpio image ***
Dec  1 03:36:13 np0005540741 dracut[1269]: *** Store current command line parameters ***
Dec  1 03:36:13 np0005540741 dracut[1269]: Stored kernel commandline:
Dec  1 03:36:13 np0005540741 dracut[1269]: No dracut internal kernel commandline stored in the initramfs
Dec  1 03:36:13 np0005540741 dracut[1269]: *** Install squash loader ***
Dec  1 03:36:14 np0005540741 dracut[1269]: *** Squashing the files inside the initramfs ***
Dec  1 03:36:15 np0005540741 dracut[1269]: *** Squashing the files inside the initramfs done ***
Dec  1 03:36:15 np0005540741 dracut[1269]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Dec  1 03:36:15 np0005540741 dracut[1269]: *** Hardlinking files ***
Dec  1 03:36:15 np0005540741 dracut[1269]: *** Hardlinking files done ***
Dec  1 03:36:15 np0005540741 dracut[1269]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Dec  1 03:36:16 np0005540741 kdumpctl[1019]: kdump: kexec: loaded kdump kernel
Dec  1 03:36:16 np0005540741 kdumpctl[1019]: kdump: Starting kdump: [OK]
Dec  1 03:36:16 np0005540741 systemd[1]: Finished Crash recovery kernel arming.
Dec  1 03:36:16 np0005540741 systemd[1]: Startup finished in 1.857s (kernel) + 2.605s (initrd) + 17.040s (userspace) = 21.503s.
Dec  1 03:36:32 np0005540741 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  1 03:36:34 np0005540741 systemd[1]: Created slice User Slice of UID 1000.
Dec  1 03:36:34 np0005540741 systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec  1 03:36:34 np0005540741 systemd-logind[788]: New session 1 of user zuul.
Dec  1 03:36:34 np0005540741 systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec  1 03:36:34 np0005540741 systemd[1]: Starting User Manager for UID 1000...
Dec  1 03:36:34 np0005540741 systemd[4302]: Queued start job for default target Main User Target.
Dec  1 03:36:34 np0005540741 systemd[4302]: Created slice User Application Slice.
Dec  1 03:36:34 np0005540741 systemd[4302]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  1 03:36:34 np0005540741 systemd[4302]: Started Daily Cleanup of User's Temporary Directories.
Dec  1 03:36:34 np0005540741 systemd[4302]: Reached target Paths.
Dec  1 03:36:34 np0005540741 systemd[4302]: Reached target Timers.
Dec  1 03:36:34 np0005540741 systemd[4302]: Starting D-Bus User Message Bus Socket...
Dec  1 03:36:34 np0005540741 systemd[4302]: Starting Create User's Volatile Files and Directories...
Dec  1 03:36:34 np0005540741 systemd[4302]: Finished Create User's Volatile Files and Directories.
Dec  1 03:36:34 np0005540741 systemd[4302]: Listening on D-Bus User Message Bus Socket.
Dec  1 03:36:34 np0005540741 systemd[4302]: Reached target Sockets.
Dec  1 03:36:34 np0005540741 systemd[4302]: Reached target Basic System.
Dec  1 03:36:34 np0005540741 systemd[4302]: Reached target Main User Target.
Dec  1 03:36:34 np0005540741 systemd[4302]: Startup finished in 132ms.
Dec  1 03:36:34 np0005540741 systemd[1]: Started User Manager for UID 1000.
Dec  1 03:36:34 np0005540741 systemd[1]: Started Session 1 of User zuul.
Dec  1 03:36:34 np0005540741 python3[4384]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 03:36:37 np0005540741 python3[4412]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 03:36:42 np0005540741 python3[4470]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 03:36:43 np0005540741 python3[4510]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec  1 03:36:45 np0005540741 python3[4536]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDWBn8adtzj5VQSMTGCbPZbb2t89ydBfmt2xnyrzcW5lk8iyDmU7stlRl54OXCT92s+i4aEedo3N8/84mj25AaJuCIy6nrmTOHyLnfcjuosGYCnwHzCe19VyrE/wDLd61C0JtSfnBmMiz2v89KKX1dwLfQ8rY6+SsNUONSbNacinrODyDCJAZX+8BD0WWCHAFXp1sJrMs03LwF6slZnK38R/nNniLlW5wrtwmsinG8g3TYTMxhnoleJgzOOOdLLN17z+IyHtpK/U82kBeP3113pUfJt+oNS/yFZJvzATFsc5sbQwPqscJ/tuge5khq+PAMcFQnfPLwl8sWM+bmMT/nybM1cGGvMR9sodRHwRFNoluvDjYHvT/sTGItVocsh+4rwmhxxVv7eWZhxPcChUvuOA1/hlWYHlie6GhVjWn2363noxAZXR4xarW++iECASNQbL03ddJYNKsQoaUGwrCG7uJcrMVgNYCsQScqEq7uh0Kw75SGqU6KbKJgfZ/N5Gss= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:36:45 np0005540741 python3[4560]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:36:46 np0005540741 python3[4659]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:36:46 np0005540741 python3[4730]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764578206.070604-207-198921412613347/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=b2ce5c11e0624cf7b75dcf49498569ff_id_rsa follow=False checksum=feda567e9354865c74d371505b3546f00914f204 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:36:47 np0005540741 python3[4853]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:36:47 np0005540741 python3[4924]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764578207.1587334-240-59453237360859/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=b2ce5c11e0624cf7b75dcf49498569ff_id_rsa.pub follow=False checksum=9e3220d039ae9e5c26ede6336dc219a70d0b7eba backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:36:49 np0005540741 python3[4972]: ansible-ping Invoked with data=pong
Dec  1 03:36:50 np0005540741 python3[4996]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 03:36:51 np0005540741 python3[5054]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec  1 03:36:52 np0005540741 python3[5086]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:36:52 np0005540741 python3[5110]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:36:53 np0005540741 python3[5134]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:36:53 np0005540741 python3[5158]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:36:53 np0005540741 python3[5182]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:36:54 np0005540741 python3[5206]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:36:55 np0005540741 python3[5232]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:36:56 np0005540741 python3[5310]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:36:56 np0005540741 python3[5383]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764578215.640814-21-243403188566425/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:36:57 np0005540741 python3[5431]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:36:57 np0005540741 python3[5455]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:36:57 np0005540741 python3[5479]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:36:57 np0005540741 python3[5503]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:36:58 np0005540741 python3[5527]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:36:58 np0005540741 python3[5551]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:36:58 np0005540741 python3[5575]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:36:59 np0005540741 python3[5599]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:36:59 np0005540741 python3[5623]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:36:59 np0005540741 python3[5647]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:36:59 np0005540741 python3[5671]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:37:00 np0005540741 python3[5695]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:37:00 np0005540741 python3[5719]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:37:00 np0005540741 python3[5743]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:37:00 np0005540741 python3[5767]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:37:01 np0005540741 python3[5791]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:37:01 np0005540741 python3[5815]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:37:01 np0005540741 python3[5839]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:37:02 np0005540741 python3[5863]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:37:02 np0005540741 python3[5887]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:37:02 np0005540741 python3[5911]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:37:02 np0005540741 python3[5935]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:37:03 np0005540741 python3[5959]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:37:03 np0005540741 python3[5983]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:37:03 np0005540741 python3[6007]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:37:03 np0005540741 python3[6031]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:37:06 np0005540741 python3[6057]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  1 03:37:06 np0005540741 systemd[1]: Starting Time & Date Service...
Dec  1 03:37:06 np0005540741 systemd[1]: Started Time & Date Service.
Dec  1 03:37:06 np0005540741 systemd-timedated[6059]: Changed time zone to 'UTC' (UTC).
Dec  1 03:37:07 np0005540741 python3[6088]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:37:07 np0005540741 python3[6164]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:37:07 np0005540741 python3[6235]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764578227.380935-153-225488970151633/source _original_basename=tmp_qz7o2l_ follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:37:08 np0005540741 python3[6335]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:37:08 np0005540741 python3[6406]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764578228.1802669-183-160260386879000/source _original_basename=tmphofnz4o9 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:37:09 np0005540741 python3[6508]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:37:09 np0005540741 python3[6581]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764578229.2338011-231-239681911058611/source _original_basename=tmpd97heb_4 follow=False checksum=2e193f101b911db5e638a5fc33120ba1c99c8f88 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:37:10 np0005540741 python3[6629]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 03:37:10 np0005540741 python3[6655]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 03:37:11 np0005540741 python3[6735]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:37:11 np0005540741 python3[6808]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764578230.8271308-273-28826554359574/source _original_basename=tmpj4g8imht follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:37:12 np0005540741 python3[6859]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-d947-2e6a-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 03:37:12 np0005540741 python3[6887]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-d947-2e6a-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec  1 03:37:13 np0005540741 python3[6916]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:37:33 np0005540741 python3[6942]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:37:36 np0005540741 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  1 03:38:18 np0005540741 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec  1 03:38:18 np0005540741 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec  1 03:38:18 np0005540741 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec  1 03:38:18 np0005540741 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec  1 03:38:18 np0005540741 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec  1 03:38:18 np0005540741 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec  1 03:38:18 np0005540741 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec  1 03:38:18 np0005540741 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec  1 03:38:18 np0005540741 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec  1 03:38:18 np0005540741 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec  1 03:38:18 np0005540741 NetworkManager[858]: <info>  [1764578298.9251] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  1 03:38:18 np0005540741 systemd-udevd[6946]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 03:38:18 np0005540741 NetworkManager[858]: <info>  [1764578298.9464] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 03:38:18 np0005540741 NetworkManager[858]: <info>  [1764578298.9512] settings: (eth1): created default wired connection 'Wired connection 1'
Dec  1 03:38:18 np0005540741 NetworkManager[858]: <info>  [1764578298.9522] device (eth1): carrier: link connected
Dec  1 03:38:18 np0005540741 NetworkManager[858]: <info>  [1764578298.9526] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec  1 03:38:18 np0005540741 NetworkManager[858]: <info>  [1764578298.9539] policy: auto-activating connection 'Wired connection 1' (c6b7d4f6-4237-35c7-90cb-622f3da1d185)
Dec  1 03:38:18 np0005540741 NetworkManager[858]: <info>  [1764578298.9548] device (eth1): Activation: starting connection 'Wired connection 1' (c6b7d4f6-4237-35c7-90cb-622f3da1d185)
Dec  1 03:38:18 np0005540741 NetworkManager[858]: <info>  [1764578298.9550] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 03:38:18 np0005540741 NetworkManager[858]: <info>  [1764578298.9558] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 03:38:18 np0005540741 NetworkManager[858]: <info>  [1764578298.9567] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 03:38:18 np0005540741 NetworkManager[858]: <info>  [1764578298.9577] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  1 03:38:20 np0005540741 python3[6972]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-bd7e-bab5-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 03:38:30 np0005540741 python3[7052]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:38:30 np0005540741 python3[7125]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764578309.7922854-102-9430775054342/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=351d510ee20d95814e9a8640058fdb7b2e7669b8 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:38:31 np0005540741 python3[7175]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 03:38:31 np0005540741 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec  1 03:38:31 np0005540741 systemd[1]: Stopped Network Manager Wait Online.
Dec  1 03:38:31 np0005540741 systemd[1]: Stopping Network Manager Wait Online...
Dec  1 03:38:31 np0005540741 systemd[1]: Stopping Network Manager...
Dec  1 03:38:31 np0005540741 NetworkManager[858]: <info>  [1764578311.3104] caught SIGTERM, shutting down normally.
Dec  1 03:38:31 np0005540741 NetworkManager[858]: <info>  [1764578311.3116] dhcp4 (eth0): canceled DHCP transaction
Dec  1 03:38:31 np0005540741 NetworkManager[858]: <info>  [1764578311.3117] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 03:38:31 np0005540741 NetworkManager[858]: <info>  [1764578311.3117] dhcp4 (eth0): state changed no lease
Dec  1 03:38:31 np0005540741 NetworkManager[858]: <info>  [1764578311.3119] manager: NetworkManager state is now CONNECTING
Dec  1 03:38:31 np0005540741 NetworkManager[858]: <info>  [1764578311.3233] dhcp4 (eth1): canceled DHCP transaction
Dec  1 03:38:31 np0005540741 NetworkManager[858]: <info>  [1764578311.3233] dhcp4 (eth1): state changed no lease
Dec  1 03:38:31 np0005540741 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 03:38:31 np0005540741 NetworkManager[858]: <info>  [1764578311.3268] exiting (success)
Dec  1 03:38:31 np0005540741 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 03:38:31 np0005540741 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec  1 03:38:31 np0005540741 systemd[1]: Stopped Network Manager.
Dec  1 03:38:31 np0005540741 systemd[1]: NetworkManager.service: Consumed 1.022s CPU time, 10.1M memory peak.
Dec  1 03:38:31 np0005540741 systemd[1]: Starting Network Manager...
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.3769] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:fbf967f0-219c-4ceb-b589-3e4f3756d2b4)
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.3772] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.3828] manager[0x5620120dd070]: monitoring kernel firmware directory '/lib/firmware'.
Dec  1 03:38:31 np0005540741 systemd[1]: Starting Hostname Service...
Dec  1 03:38:31 np0005540741 systemd[1]: Started Hostname Service.
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.4893] hostname: hostname: using hostnamed
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.4897] hostname: static hostname changed from (none) to "np0005540741.novalocal"
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.4905] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.4910] manager[0x5620120dd070]: rfkill: Wi-Fi hardware radio set enabled
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.4911] manager[0x5620120dd070]: rfkill: WWAN hardware radio set enabled
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.4955] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.4955] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.4956] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.4957] manager: Networking is enabled by state file
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.4961] settings: Loaded settings plugin: keyfile (internal)
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.4968] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5011] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5026] dhcp: init: Using DHCP client 'internal'
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5031] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5038] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5046] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5059] device (lo): Activation: starting connection 'lo' (d32b959f-25b9-49e2-b1c9-8c743b9b7f56)
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5069] device (eth0): carrier: link connected
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5075] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5083] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5084] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5095] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5106] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5116] device (eth1): carrier: link connected
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5122] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5129] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (c6b7d4f6-4237-35c7-90cb-622f3da1d185) (indicated)
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5129] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5137] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5151] device (eth1): Activation: starting connection 'Wired connection 1' (c6b7d4f6-4237-35c7-90cb-622f3da1d185)
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5160] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  1 03:38:31 np0005540741 systemd[1]: Started Network Manager.
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5168] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5173] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5176] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5180] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5187] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5190] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5194] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5198] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5207] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5212] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5226] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5229] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5252] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5261] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5269] device (lo): Activation: successful, device activated.
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5279] dhcp4 (eth0): state changed new lease, address=38.102.83.132
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5290] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  1 03:38:31 np0005540741 systemd[1]: Starting Network Manager Wait Online...
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5432] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5469] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5472] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5475] manager: NetworkManager state is now CONNECTED_SITE
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5478] device (eth0): Activation: successful, device activated.
Dec  1 03:38:31 np0005540741 NetworkManager[7186]: <info>  [1764578311.5483] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  1 03:38:31 np0005540741 python3[7259]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-bd7e-bab5-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 03:38:41 np0005540741 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 03:39:01 np0005540741 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  1 03:39:06 np0005540741 systemd[4302]: Starting Mark boot as successful...
Dec  1 03:39:06 np0005540741 systemd[4302]: Finished Mark boot as successful.
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0314] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  1 03:39:17 np0005540741 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 03:39:17 np0005540741 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0654] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0657] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0666] device (eth1): Activation: successful, device activated.
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0671] manager: startup complete
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0676] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <warn>  [1764578357.0680] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0687] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec  1 03:39:17 np0005540741 systemd[1]: Finished Network Manager Wait Online.
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0773] dhcp4 (eth1): canceled DHCP transaction
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0774] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0774] dhcp4 (eth1): state changed no lease
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0786] policy: auto-activating connection 'ci-private-network' (178f6c72-5a8d-5d35-b5e8-6b22bb1c98c8)
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0790] device (eth1): Activation: starting connection 'ci-private-network' (178f6c72-5a8d-5d35-b5e8-6b22bb1c98c8)
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0790] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0792] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0796] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0802] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0830] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0831] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 03:39:17 np0005540741 NetworkManager[7186]: <info>  [1764578357.0834] device (eth1): Activation: successful, device activated.
Dec  1 03:39:27 np0005540741 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 03:39:31 np0005540741 systemd-logind[788]: Session 1 logged out. Waiting for processes to exit.
Dec  1 03:39:33 np0005540741 systemd-logind[788]: New session 3 of user zuul.
Dec  1 03:39:33 np0005540741 systemd[1]: Started Session 3 of User zuul.
Dec  1 03:39:34 np0005540741 python3[7369]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:39:34 np0005540741 python3[7442]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764578373.9976506-267-187034409188700/source _original_basename=tmpcmm2z9eo follow=False checksum=72146e9f9cee0111e1af10d9a8bd93298758ed4f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:39:36 np0005540741 systemd[1]: session-3.scope: Deactivated successfully.
Dec  1 03:39:36 np0005540741 systemd-logind[788]: Session 3 logged out. Waiting for processes to exit.
Dec  1 03:39:36 np0005540741 systemd-logind[788]: Removed session 3.
Dec  1 03:42:06 np0005540741 systemd[4302]: Created slice User Background Tasks Slice.
Dec  1 03:42:06 np0005540741 systemd[4302]: Starting Cleanup of User's Temporary Files and Directories...
Dec  1 03:42:06 np0005540741 systemd[4302]: Finished Cleanup of User's Temporary Files and Directories.
Dec  1 03:46:08 np0005540741 systemd-logind[788]: New session 4 of user zuul.
Dec  1 03:46:08 np0005540741 systemd[1]: Started Session 4 of User zuul.
Dec  1 03:46:08 np0005540741 python3[7501]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-360c-38fc-000000001cd4-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 03:46:08 np0005540741 python3[7529]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:46:09 np0005540741 python3[7556]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:46:09 np0005540741 python3[7582]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:46:09 np0005540741 python3[7608]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:46:10 np0005540741 python3[7634]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:46:10 np0005540741 python3[7712]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:46:11 np0005540741 python3[7785]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764578770.6514866-479-277063520978790/source _original_basename=tmpwk3dlxpv follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:46:12 np0005540741 python3[7835]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 03:46:12 np0005540741 systemd[1]: Reloading.
Dec  1 03:46:12 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 03:46:13 np0005540741 python3[7891]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec  1 03:46:14 np0005540741 python3[7917]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 03:46:14 np0005540741 python3[7945]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 03:46:14 np0005540741 python3[7973]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 03:46:15 np0005540741 python3[8001]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 03:46:15 np0005540741 python3[8028]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-360c-38fc-000000001cdb-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 03:46:16 np0005540741 python3[8058]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  1 03:46:18 np0005540741 systemd-logind[788]: Session 4 logged out. Waiting for processes to exit.
Dec  1 03:46:18 np0005540741 systemd[1]: session-4.scope: Deactivated successfully.
Dec  1 03:46:18 np0005540741 systemd[1]: session-4.scope: Consumed 4.462s CPU time.
Dec  1 03:46:18 np0005540741 systemd-logind[788]: Removed session 4.
Dec  1 03:46:19 np0005540741 systemd-logind[788]: New session 5 of user zuul.
Dec  1 03:46:19 np0005540741 systemd[1]: Started Session 5 of User zuul.
Dec  1 03:46:20 np0005540741 python3[8092]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec  1 03:46:34 np0005540741 kernel: SELinux:  Converting 385 SID table entries...
Dec  1 03:46:34 np0005540741 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 03:46:34 np0005540741 kernel: SELinux:  policy capability open_perms=1
Dec  1 03:46:34 np0005540741 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 03:46:34 np0005540741 kernel: SELinux:  policy capability always_check_network=0
Dec  1 03:46:34 np0005540741 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 03:46:34 np0005540741 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 03:46:34 np0005540741 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 03:46:43 np0005540741 kernel: SELinux:  Converting 385 SID table entries...
Dec  1 03:46:43 np0005540741 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 03:46:43 np0005540741 kernel: SELinux:  policy capability open_perms=1
Dec  1 03:46:43 np0005540741 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 03:46:43 np0005540741 kernel: SELinux:  policy capability always_check_network=0
Dec  1 03:46:43 np0005540741 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 03:46:43 np0005540741 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 03:46:43 np0005540741 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 03:46:51 np0005540741 kernel: SELinux:  Converting 385 SID table entries...
Dec  1 03:46:52 np0005540741 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 03:46:52 np0005540741 kernel: SELinux:  policy capability open_perms=1
Dec  1 03:46:52 np0005540741 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 03:46:52 np0005540741 kernel: SELinux:  policy capability always_check_network=0
Dec  1 03:46:52 np0005540741 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 03:46:52 np0005540741 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 03:46:52 np0005540741 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 03:46:53 np0005540741 setsebool[8158]: The virt_use_nfs policy boolean was changed to 1 by root
Dec  1 03:46:53 np0005540741 setsebool[8158]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec  1 03:47:03 np0005540741 kernel: SELinux:  Converting 388 SID table entries...
Dec  1 03:47:03 np0005540741 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 03:47:03 np0005540741 kernel: SELinux:  policy capability open_perms=1
Dec  1 03:47:03 np0005540741 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 03:47:03 np0005540741 kernel: SELinux:  policy capability always_check_network=0
Dec  1 03:47:03 np0005540741 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 03:47:03 np0005540741 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 03:47:03 np0005540741 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 03:47:21 np0005540741 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec  1 03:47:21 np0005540741 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 03:47:22 np0005540741 systemd[1]: Starting man-db-cache-update.service...
Dec  1 03:47:22 np0005540741 systemd[1]: Reloading.
Dec  1 03:47:22 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 03:47:22 np0005540741 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 03:47:25 np0005540741 python3[11844]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-7826-38cf-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 03:47:26 np0005540741 kernel: evm: overlay not supported
Dec  1 03:47:26 np0005540741 systemd[4302]: Starting D-Bus User Message Bus...
Dec  1 03:47:26 np0005540741 dbus-broker-launch[12638]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec  1 03:47:26 np0005540741 dbus-broker-launch[12638]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec  1 03:47:26 np0005540741 systemd[4302]: Started D-Bus User Message Bus.
Dec  1 03:47:26 np0005540741 dbus-broker-lau[12638]: Ready
Dec  1 03:47:26 np0005540741 systemd[4302]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec  1 03:47:26 np0005540741 systemd[4302]: Created slice Slice /user.
Dec  1 03:47:26 np0005540741 systemd[4302]: podman-12543.scope: unit configures an IP firewall, but not running as root.
Dec  1 03:47:26 np0005540741 systemd[4302]: (This warning is only shown for the first unit using IP firewalling.)
Dec  1 03:47:26 np0005540741 systemd[4302]: Started podman-12543.scope.
Dec  1 03:47:27 np0005540741 systemd[4302]: Started podman-pause-2929e1b4.scope.
Dec  1 03:47:27 np0005540741 python3[13188]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.103:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.103:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:47:27 np0005540741 python3[13188]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Dec  1 03:47:28 np0005540741 systemd[1]: session-5.scope: Deactivated successfully.
Dec  1 03:47:28 np0005540741 systemd[1]: session-5.scope: Consumed 58.913s CPU time.
Dec  1 03:47:28 np0005540741 systemd-logind[788]: Session 5 logged out. Waiting for processes to exit.
Dec  1 03:47:28 np0005540741 systemd-logind[788]: Removed session 5.
Dec  1 03:47:51 np0005540741 irqbalance[783]: Cannot change IRQ 27 affinity: Operation not permitted
Dec  1 03:47:51 np0005540741 irqbalance[783]: IRQ 27 affinity is now unmanaged
Dec  1 03:47:51 np0005540741 systemd-logind[788]: New session 6 of user zuul.
Dec  1 03:47:51 np0005540741 systemd[1]: Started Session 6 of User zuul.
Dec  1 03:47:51 np0005540741 python3[22316]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD1SOTRFLynXDwtgjr8Jpb8q9uXMxfBHXlGklyuTKwPjDPXnAOML+Jen7YniVKHCDh1af/hIsfBphF1Trq0+ElA= zuul@np0005540740.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:47:52 np0005540741 python3[22480]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD1SOTRFLynXDwtgjr8Jpb8q9uXMxfBHXlGklyuTKwPjDPXnAOML+Jen7YniVKHCDh1af/hIsfBphF1Trq0+ElA= zuul@np0005540740.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:47:53 np0005540741 python3[22782]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005540741.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec  1 03:47:53 np0005540741 python3[22991]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD1SOTRFLynXDwtgjr8Jpb8q9uXMxfBHXlGklyuTKwPjDPXnAOML+Jen7YniVKHCDh1af/hIsfBphF1Trq0+ElA= zuul@np0005540740.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 03:47:54 np0005540741 python3[23244]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:47:54 np0005540741 python3[23486]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764578873.7994027-135-188411347745223/source _original_basename=tmp4km6e_s3 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:47:55 np0005540741 python3[23766]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Dec  1 03:47:55 np0005540741 systemd[1]: Starting Hostname Service...
Dec  1 03:47:55 np0005540741 systemd[1]: Started Hostname Service.
Dec  1 03:47:55 np0005540741 systemd-hostnamed[23864]: Changed pretty hostname to 'compute-0'
Dec  1 03:47:55 np0005540741 systemd-hostnamed[23864]: Hostname set to <compute-0> (static)
Dec  1 03:47:55 np0005540741 NetworkManager[7186]: <info>  [1764578875.6998] hostname: static hostname changed from "np0005540741.novalocal" to "compute-0"
Dec  1 03:47:55 np0005540741 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 03:47:55 np0005540741 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 03:47:56 np0005540741 systemd[1]: session-6.scope: Deactivated successfully.
Dec  1 03:47:56 np0005540741 systemd[1]: session-6.scope: Consumed 2.683s CPU time.
Dec  1 03:47:56 np0005540741 systemd-logind[788]: Session 6 logged out. Waiting for processes to exit.
Dec  1 03:47:56 np0005540741 systemd-logind[788]: Removed session 6.
Dec  1 03:48:05 np0005540741 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 03:48:13 np0005540741 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 03:48:13 np0005540741 systemd[1]: Finished man-db-cache-update.service.
Dec  1 03:48:13 np0005540741 systemd[1]: man-db-cache-update.service: Consumed 1min 3.072s CPU time.
Dec  1 03:48:13 np0005540741 systemd[1]: run-rb558ef0e182540369dfba52fc1496cf3.service: Deactivated successfully.
Dec  1 03:48:25 np0005540741 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  1 03:51:06 np0005540741 systemd[1]: Starting Cleanup of Temporary Directories...
Dec  1 03:51:06 np0005540741 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec  1 03:51:07 np0005540741 systemd[1]: Finished Cleanup of Temporary Directories.
Dec  1 03:51:07 np0005540741 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec  1 03:51:27 np0005540741 systemd-logind[788]: New session 7 of user zuul.
Dec  1 03:51:27 np0005540741 systemd[1]: Started Session 7 of User zuul.
Dec  1 03:51:28 np0005540741 python3[30041]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 03:51:29 np0005540741 python3[30157]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:51:30 np0005540741 python3[30230]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764579089.404085-33574-56968449648190/source mode=0755 _original_basename=delorean.repo follow=False checksum=39c885eb875fd03e010d1b0454241c26b121dfb2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:51:30 np0005540741 python3[30256]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:51:30 np0005540741 python3[30329]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764579089.404085-33574-56968449648190/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:51:31 np0005540741 python3[30355]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:51:31 np0005540741 python3[30428]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764579089.404085-33574-56968449648190/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:51:31 np0005540741 python3[30454]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:51:31 np0005540741 python3[30527]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764579089.404085-33574-56968449648190/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:51:32 np0005540741 python3[30553]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:51:32 np0005540741 python3[30626]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764579089.404085-33574-56968449648190/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:51:32 np0005540741 python3[30652]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:51:33 np0005540741 python3[30725]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764579089.404085-33574-56968449648190/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:51:33 np0005540741 python3[30751]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 03:51:33 np0005540741 python3[30824]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764579089.404085-33574-56968449648190/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6e18e2038d54303b4926db53c0b6cced515a9151 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 03:51:44 np0005540741 python3[30882]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 03:56:44 np0005540741 systemd[1]: session-7.scope: Deactivated successfully.
Dec  1 03:56:44 np0005540741 systemd-logind[788]: Session 7 logged out. Waiting for processes to exit.
Dec  1 03:56:44 np0005540741 systemd[1]: session-7.scope: Consumed 4.867s CPU time.
Dec  1 03:56:44 np0005540741 systemd-logind[788]: Removed session 7.
Dec  1 04:04:11 np0005540741 systemd-logind[788]: New session 8 of user zuul.
Dec  1 04:04:11 np0005540741 systemd[1]: Started Session 8 of User zuul.
Dec  1 04:04:12 np0005540741 python3.9[31058]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:04:13 np0005540741 python3.9[31239]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:04:21 np0005540741 systemd[1]: session-8.scope: Deactivated successfully.
Dec  1 04:04:21 np0005540741 systemd[1]: session-8.scope: Consumed 7.872s CPU time.
Dec  1 04:04:21 np0005540741 systemd-logind[788]: Session 8 logged out. Waiting for processes to exit.
Dec  1 04:04:21 np0005540741 systemd-logind[788]: Removed session 8.
Dec  1 04:04:37 np0005540741 systemd-logind[788]: New session 9 of user zuul.
Dec  1 04:04:37 np0005540741 systemd[1]: Started Session 9 of User zuul.
Dec  1 04:04:38 np0005540741 python3.9[31449]: ansible-ansible.legacy.ping Invoked with data=pong
Dec  1 04:04:39 np0005540741 python3.9[31623]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:04:40 np0005540741 python3.9[31775]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:04:41 np0005540741 irqbalance[783]: Cannot change IRQ 26 affinity: Operation not permitted
Dec  1 04:04:41 np0005540741 irqbalance[783]: IRQ 26 affinity is now unmanaged
Dec  1 04:04:41 np0005540741 python3.9[31928]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:04:42 np0005540741 python3.9[32080]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:04:43 np0005540741 python3.9[32232]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:04:43 np0005540741 python3.9[32355]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764579882.7072837-73-140268283855490/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:04:44 np0005540741 python3.9[32507]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:04:45 np0005540741 python3.9[32663]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:04:46 np0005540741 python3.9[32815]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:04:47 np0005540741 python3.9[32965]: ansible-ansible.builtin.service_facts Invoked
Dec  1 04:04:50 np0005540741 python3.9[33218]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:04:51 np0005540741 python3.9[33368]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:04:52 np0005540741 python3.9[33522]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:04:53 np0005540741 python3.9[33680]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:04:54 np0005540741 python3.9[33764]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:05:40 np0005540741 systemd[1]: Reloading.
Dec  1 04:05:40 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:05:40 np0005540741 systemd[1]: Starting dnf makecache...
Dec  1 04:05:40 np0005540741 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec  1 04:05:40 np0005540741 dnf[33971]: Failed determining last makecache time.
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-openstack-barbican-42b4c41831408a8e323 149 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 167 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-openstack-cinder-1c00d6490d88e436f26ef 186 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-python-stevedore-c4acc5639fd2329372142 184 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-python-cloudkitty-tests-tempest-2c80f8 191 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-os-net-config-d0cedbdb788d43e5c7551df5 169 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 systemd[1]: Reloading.
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 167 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-python-designate-tests-tempest-347fdbc 196 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-openstack-glance-1fd12c29b339f30fe823e 191 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 126 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-openstack-manila-3c01b7181572c95dac462 173 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-python-whitebox-neutron-tests-tempest- 187 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-openstack-octavia-ba397f07a7331190208c 187 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-openstack-watcher-c014f81a8647287f6dcc 181 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-ansible-config_template-5ccaa22121a7ff 187 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 194 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-openstack-swift-dc98a8463506ac520c469a 195 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-python-tempestconf-8515371b7cceebd4282 177 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 dnf[33971]: delorean-openstack-heat-ui-013accbfd179753bc3f0 196 kB/s | 3.0 kB     00:00
Dec  1 04:05:40 np0005540741 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec  1 04:05:40 np0005540741 systemd[1]: Reloading.
Dec  1 04:05:41 np0005540741 dnf[33971]: CentOS Stream 9 - BaseOS                         79 kB/s | 7.3 kB     00:00
Dec  1 04:05:41 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:05:41 np0005540741 dnf[33971]: CentOS Stream 9 - AppStream                      78 kB/s | 7.4 kB     00:00
Dec  1 04:05:41 np0005540741 systemd[1]: Listening on LVM2 poll daemon socket.
Dec  1 04:05:41 np0005540741 dbus-broker-launch[757]: Noticed file-system modification, trigger reload.
Dec  1 04:05:41 np0005540741 dbus-broker-launch[757]: Noticed file-system modification, trigger reload.
Dec  1 04:05:41 np0005540741 dbus-broker-launch[757]: Noticed file-system modification, trigger reload.
Dec  1 04:05:41 np0005540741 dnf[33971]: CentOS Stream 9 - CRB                            48 kB/s | 7.2 kB     00:00
Dec  1 04:05:41 np0005540741 dnf[33971]: CentOS Stream 9 - Extras packages                72 kB/s | 8.3 kB     00:00
Dec  1 04:05:41 np0005540741 dnf[33971]: dlrn-antelope-testing                           168 kB/s | 3.0 kB     00:00
Dec  1 04:05:41 np0005540741 dnf[33971]: dlrn-antelope-build-deps                        172 kB/s | 3.0 kB     00:00
Dec  1 04:05:41 np0005540741 dnf[33971]: centos9-rabbitmq                                113 kB/s | 3.0 kB     00:00
Dec  1 04:05:41 np0005540741 dnf[33971]: centos9-storage                                 127 kB/s | 3.0 kB     00:00
Dec  1 04:05:41 np0005540741 dnf[33971]: centos9-opstools                                132 kB/s | 3.0 kB     00:00
Dec  1 04:05:41 np0005540741 dnf[33971]: NFV SIG OpenvSwitch                             142 kB/s | 3.0 kB     00:00
Dec  1 04:05:41 np0005540741 dnf[33971]: repo-setup-centos-appstream                     210 kB/s | 4.4 kB     00:00
Dec  1 04:05:41 np0005540741 dnf[33971]: repo-setup-centos-baseos                        173 kB/s | 3.9 kB     00:00
Dec  1 04:05:41 np0005540741 dnf[33971]: repo-setup-centos-highavailability              165 kB/s | 3.9 kB     00:00
Dec  1 04:05:42 np0005540741 dnf[33971]: repo-setup-centos-powertools                    183 kB/s | 4.3 kB     00:00
Dec  1 04:05:42 np0005540741 dnf[33971]: Extra Packages for Enterprise Linux 9 - x86_64  238 kB/s |  30 kB     00:00
Dec  1 04:05:42 np0005540741 dnf[33971]: Metadata cache created.
Dec  1 04:05:42 np0005540741 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec  1 04:05:42 np0005540741 systemd[1]: Finished dnf makecache.
Dec  1 04:05:42 np0005540741 systemd[1]: dnf-makecache.service: Consumed 1.762s CPU time.
Dec  1 04:06:45 np0005540741 kernel: SELinux:  Converting 2718 SID table entries...
Dec  1 04:06:45 np0005540741 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:06:45 np0005540741 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:06:45 np0005540741 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:06:45 np0005540741 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:06:45 np0005540741 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:06:45 np0005540741 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:06:45 np0005540741 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:06:46 np0005540741 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec  1 04:06:46 np0005540741 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:06:46 np0005540741 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:06:46 np0005540741 systemd[1]: Reloading.
Dec  1 04:06:46 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:06:46 np0005540741 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:06:47 np0005540741 python3.9[35339]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:06:47 np0005540741 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:06:47 np0005540741 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:06:47 np0005540741 systemd[1]: man-db-cache-update.service: Consumed 1.155s CPU time.
Dec  1 04:06:47 np0005540741 systemd[1]: run-r5f1981fab399474899468b8fa3ca5782.service: Deactivated successfully.
Dec  1 04:06:49 np0005540741 python3.9[35621]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec  1 04:06:50 np0005540741 python3.9[35773]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec  1 04:06:55 np0005540741 python3.9[35926]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:06:56 np0005540741 python3.9[36078]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec  1 04:06:59 np0005540741 python3.9[36231]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:07:02 np0005540741 python3.9[36383]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:07:03 np0005540741 python3.9[36506]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580019.8627565-236-260708985041698/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=30f43b3b193e6fb640de0bf588bd9062982dce0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:07:04 np0005540741 python3.9[36658]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:07:04 np0005540741 python3.9[36810]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:07:05 np0005540741 python3.9[36963]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:07:06 np0005540741 python3.9[37115]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec  1 04:07:06 np0005540741 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:07:07 np0005540741 python3.9[37269]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 04:07:08 np0005540741 python3.9[37427]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  1 04:07:09 np0005540741 python3.9[37587]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec  1 04:07:10 np0005540741 python3.9[37740]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 04:07:10 np0005540741 python3.9[37898]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec  1 04:07:11 np0005540741 python3.9[38050]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:07:14 np0005540741 python3.9[38203]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:07:14 np0005540741 python3.9[38355]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:07:15 np0005540741 python3.9[38478]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764580034.1952848-355-214382523928274/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:07:16 np0005540741 python3.9[38630]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:07:16 np0005540741 systemd[1]: Starting Load Kernel Modules...
Dec  1 04:07:16 np0005540741 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec  1 04:07:16 np0005540741 kernel: Bridge firewalling registered
Dec  1 04:07:16 np0005540741 systemd-modules-load[38634]: Inserted module 'br_netfilter'
Dec  1 04:07:16 np0005540741 systemd[1]: Finished Load Kernel Modules.
Dec  1 04:07:16 np0005540741 python3.9[38789]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:07:17 np0005540741 python3.9[38912]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764580036.5182004-378-59291590312498/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:07:18 np0005540741 python3.9[39064]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:07:21 np0005540741 dbus-broker-launch[757]: Noticed file-system modification, trigger reload.
Dec  1 04:07:21 np0005540741 dbus-broker-launch[757]: Noticed file-system modification, trigger reload.
Dec  1 04:07:22 np0005540741 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:07:22 np0005540741 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:07:22 np0005540741 systemd[1]: Reloading.
Dec  1 04:07:22 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:07:22 np0005540741 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:07:24 np0005540741 python3.9[41104]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:07:24 np0005540741 python3.9[42101]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec  1 04:07:25 np0005540741 python3.9[42771]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:07:26 np0005540741 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:07:26 np0005540741 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:07:26 np0005540741 systemd[1]: man-db-cache-update.service: Consumed 4.733s CPU time.
Dec  1 04:07:26 np0005540741 systemd[1]: run-r9f23171367874ed690c57a11302e9c11.service: Deactivated successfully.
Dec  1 04:07:26 np0005540741 python3.9[43223]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:07:26 np0005540741 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  1 04:07:26 np0005540741 systemd[1]: Starting Authorization Manager...
Dec  1 04:07:26 np0005540741 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  1 04:07:26 np0005540741 polkitd[43441]: Started polkitd version 0.117
Dec  1 04:07:26 np0005540741 systemd[1]: Started Authorization Manager.
Dec  1 04:07:27 np0005540741 python3.9[43611]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:07:27 np0005540741 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec  1 04:07:27 np0005540741 systemd[1]: tuned.service: Deactivated successfully.
Dec  1 04:07:27 np0005540741 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec  1 04:07:27 np0005540741 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  1 04:07:28 np0005540741 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  1 04:07:28 np0005540741 python3.9[43773]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec  1 04:07:30 np0005540741 python3.9[43925]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:07:31 np0005540741 systemd[1]: Reloading.
Dec  1 04:07:31 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:07:32 np0005540741 python3.9[44114]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:07:32 np0005540741 systemd[1]: Reloading.
Dec  1 04:07:32 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:07:33 np0005540741 python3.9[44302]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:07:34 np0005540741 python3.9[44455]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:07:34 np0005540741 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec  1 04:07:35 np0005540741 python3.9[44608]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:07:37 np0005540741 python3.9[44770]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:07:38 np0005540741 python3.9[44923]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:07:38 np0005540741 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec  1 04:07:38 np0005540741 systemd[1]: Stopped Apply Kernel Variables.
Dec  1 04:07:38 np0005540741 systemd[1]: Stopping Apply Kernel Variables...
Dec  1 04:07:38 np0005540741 systemd[1]: Starting Apply Kernel Variables...
Dec  1 04:07:38 np0005540741 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec  1 04:07:38 np0005540741 systemd[1]: Finished Apply Kernel Variables.
Dec  1 04:07:38 np0005540741 systemd[1]: session-9.scope: Deactivated successfully.
Dec  1 04:07:38 np0005540741 systemd[1]: session-9.scope: Consumed 2min 11.559s CPU time.
Dec  1 04:07:38 np0005540741 systemd-logind[788]: Session 9 logged out. Waiting for processes to exit.
Dec  1 04:07:38 np0005540741 systemd-logind[788]: Removed session 9.
Dec  1 04:07:45 np0005540741 systemd-logind[788]: New session 10 of user zuul.
Dec  1 04:07:45 np0005540741 systemd[1]: Started Session 10 of User zuul.
Dec  1 04:07:46 np0005540741 python3.9[45106]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:07:47 np0005540741 python3.9[45262]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec  1 04:07:48 np0005540741 python3.9[45415]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 04:07:49 np0005540741 python3.9[45573]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  1 04:07:50 np0005540741 python3.9[45733]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:07:50 np0005540741 python3.9[45817]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  1 04:07:53 np0005540741 python3.9[45983]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:08:04 np0005540741 kernel: SELinux:  Converting 2730 SID table entries...
Dec  1 04:08:04 np0005540741 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:08:04 np0005540741 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:08:04 np0005540741 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:08:04 np0005540741 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:08:04 np0005540741 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:08:04 np0005540741 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:08:04 np0005540741 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:08:05 np0005540741 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec  1 04:08:05 np0005540741 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec  1 04:08:06 np0005540741 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:08:06 np0005540741 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:08:06 np0005540741 systemd[1]: Reloading.
Dec  1 04:08:06 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:08:06 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:08:06 np0005540741 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:08:06 np0005540741 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:08:06 np0005540741 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:08:06 np0005540741 systemd[1]: run-rbfc7a711b29743839570d8236289c43f.service: Deactivated successfully.
Dec  1 04:08:07 np0005540741 python3.9[47080]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 04:08:08 np0005540741 systemd[1]: Reloading.
Dec  1 04:08:08 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:08:08 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:08:08 np0005540741 systemd[1]: Starting Open vSwitch Database Unit...
Dec  1 04:08:08 np0005540741 chown[47122]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec  1 04:08:08 np0005540741 ovs-ctl[47127]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec  1 04:08:08 np0005540741 ovs-ctl[47127]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec  1 04:08:08 np0005540741 ovs-ctl[47127]: Starting ovsdb-server [  OK  ]
Dec  1 04:08:08 np0005540741 ovs-vsctl[47176]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec  1 04:08:08 np0005540741 ovs-vsctl[47196]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"a8013a17-6378-4c2f-a5de-9d3b29c7a42e\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec  1 04:08:08 np0005540741 ovs-ctl[47127]: Configuring Open vSwitch system IDs [  OK  ]
Dec  1 04:08:08 np0005540741 ovs-ctl[47127]: Enabling remote OVSDB managers [  OK  ]
Dec  1 04:08:08 np0005540741 systemd[1]: Started Open vSwitch Database Unit.
Dec  1 04:08:08 np0005540741 ovs-vsctl[47202]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec  1 04:08:08 np0005540741 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec  1 04:08:08 np0005540741 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec  1 04:08:08 np0005540741 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec  1 04:08:08 np0005540741 kernel: openvswitch: Open vSwitch switching datapath
Dec  1 04:08:08 np0005540741 ovs-ctl[47246]: Inserting openvswitch module [  OK  ]
Dec  1 04:08:08 np0005540741 ovs-ctl[47215]: Starting ovs-vswitchd [  OK  ]
Dec  1 04:08:08 np0005540741 ovs-vsctl[47263]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec  1 04:08:08 np0005540741 ovs-ctl[47215]: Enabling remote OVSDB managers [  OK  ]
Dec  1 04:08:08 np0005540741 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec  1 04:08:08 np0005540741 systemd[1]: Starting Open vSwitch...
Dec  1 04:08:08 np0005540741 systemd[1]: Finished Open vSwitch.
Dec  1 04:08:09 np0005540741 python3.9[47415]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:08:10 np0005540741 python3.9[47567]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec  1 04:08:11 np0005540741 kernel: SELinux:  Converting 2744 SID table entries...
Dec  1 04:08:11 np0005540741 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:08:11 np0005540741 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:08:11 np0005540741 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:08:11 np0005540741 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:08:11 np0005540741 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:08:11 np0005540741 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:08:11 np0005540741 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:08:12 np0005540741 python3.9[47722]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:08:13 np0005540741 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec  1 04:08:13 np0005540741 python3.9[47880]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:08:15 np0005540741 python3.9[48033]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:08:17 np0005540741 python3.9[48320]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  1 04:08:17 np0005540741 python3.9[48470]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:08:18 np0005540741 python3.9[48624]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:08:20 np0005540741 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:08:20 np0005540741 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:08:20 np0005540741 systemd[1]: Reloading.
Dec  1 04:08:20 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:08:20 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:08:20 np0005540741 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:08:21 np0005540741 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:08:21 np0005540741 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:08:21 np0005540741 systemd[1]: run-r5f25536f56ca45d8a59d64e2c8369290.service: Deactivated successfully.
Dec  1 04:08:21 np0005540741 python3.9[48941]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:08:22 np0005540741 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec  1 04:08:22 np0005540741 systemd[1]: Stopped Network Manager Wait Online.
Dec  1 04:08:22 np0005540741 systemd[1]: Stopping Network Manager Wait Online...
Dec  1 04:08:22 np0005540741 systemd[1]: Stopping Network Manager...
Dec  1 04:08:22 np0005540741 NetworkManager[7186]: <info>  [1764580102.0122] caught SIGTERM, shutting down normally.
Dec  1 04:08:22 np0005540741 NetworkManager[7186]: <info>  [1764580102.0139] dhcp4 (eth0): canceled DHCP transaction
Dec  1 04:08:22 np0005540741 NetworkManager[7186]: <info>  [1764580102.0139] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:08:22 np0005540741 NetworkManager[7186]: <info>  [1764580102.0139] dhcp4 (eth0): state changed no lease
Dec  1 04:08:22 np0005540741 NetworkManager[7186]: <info>  [1764580102.0142] manager: NetworkManager state is now CONNECTED_SITE
Dec  1 04:08:22 np0005540741 NetworkManager[7186]: <info>  [1764580102.0226] exiting (success)
Dec  1 04:08:22 np0005540741 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 04:08:22 np0005540741 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 04:08:22 np0005540741 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec  1 04:08:22 np0005540741 systemd[1]: Stopped Network Manager.
Dec  1 04:08:22 np0005540741 systemd[1]: NetworkManager.service: Consumed 11.459s CPU time, 4.1M memory peak, read 0B from disk, written 21.5K to disk.
Dec  1 04:08:22 np0005540741 systemd[1]: Starting Network Manager...
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.1134] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:fbf967f0-219c-4ceb-b589-3e4f3756d2b4)
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.1136] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.1185] manager[0x55bcd234d090]: monitoring kernel firmware directory '/lib/firmware'.
Dec  1 04:08:22 np0005540741 systemd[1]: Starting Hostname Service...
Dec  1 04:08:22 np0005540741 systemd[1]: Started Hostname Service.
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2230] hostname: hostname: using hostnamed
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2231] hostname: static hostname changed from (none) to "compute-0"
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2238] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2244] manager[0x55bcd234d090]: rfkill: Wi-Fi hardware radio set enabled
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2244] manager[0x55bcd234d090]: rfkill: WWAN hardware radio set enabled
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2276] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2289] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2289] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2290] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2291] manager: Networking is enabled by state file
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2293] settings: Loaded settings plugin: keyfile (internal)
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2297] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2326] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2337] dhcp: init: Using DHCP client 'internal'
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2340] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2346] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2352] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2361] device (lo): Activation: starting connection 'lo' (d32b959f-25b9-49e2-b1c9-8c743b9b7f56)
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2368] device (eth0): carrier: link connected
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2373] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2379] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2379] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2387] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2394] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2400] device (eth1): carrier: link connected
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2405] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2410] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (178f6c72-5a8d-5d35-b5e8-6b22bb1c98c8) (indicated)
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2411] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2417] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2424] device (eth1): Activation: starting connection 'ci-private-network' (178f6c72-5a8d-5d35-b5e8-6b22bb1c98c8)
Dec  1 04:08:22 np0005540741 systemd[1]: Started Network Manager.
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2435] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2442] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2445] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2447] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2449] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2452] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2454] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2457] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2461] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2479] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2482] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2488] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2500] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2507] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2509] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2512] device (lo): Activation: successful, device activated.
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2519] dhcp4 (eth0): state changed new lease, address=38.102.83.132
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2526] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  1 04:08:22 np0005540741 systemd[1]: Starting Network Manager Wait Online...
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2739] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2791] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2807] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2824] manager: NetworkManager state is now CONNECTED_LOCAL
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2840] device (eth1): Activation: successful, device activated.
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2902] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2908] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2919] manager: NetworkManager state is now CONNECTED_SITE
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2932] device (eth0): Activation: successful, device activated.
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2942] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  1 04:08:22 np0005540741 NetworkManager[48954]: <info>  [1764580102.2952] manager: startup complete
Dec  1 04:08:22 np0005540741 systemd[1]: Finished Network Manager Wait Online.
Dec  1 04:08:22 np0005540741 python3.9[49168]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:08:27 np0005540741 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:08:27 np0005540741 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:08:27 np0005540741 systemd[1]: Reloading.
Dec  1 04:08:27 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:08:27 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:08:27 np0005540741 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:08:28 np0005540741 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:08:28 np0005540741 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:08:28 np0005540741 systemd[1]: run-rd9a7ddd98084426fa88375df5b2bc6b4.service: Deactivated successfully.
Dec  1 04:08:29 np0005540741 python3.9[49627]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:08:30 np0005540741 python3.9[49779]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:08:30 np0005540741 python3.9[49933]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:08:31 np0005540741 python3.9[50085]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:08:32 np0005540741 python3.9[50237]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:08:32 np0005540741 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 04:08:32 np0005540741 python3.9[50389]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:08:33 np0005540741 python3.9[50541]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:08:34 np0005540741 python3.9[50664]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580113.0771518-229-118631452750966/.source _original_basename=.7bqdm2kk follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:08:34 np0005540741 python3.9[50816]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:08:35 np0005540741 python3.9[50968]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec  1 04:08:36 np0005540741 python3.9[51120]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:08:38 np0005540741 python3.9[51547]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec  1 04:08:39 np0005540741 ansible-async_wrapper.py[51722]: Invoked with j688281423766 300 /home/zuul/.ansible/tmp/ansible-tmp-1764580118.5534203-295-127613937801440/AnsiballZ_edpm_os_net_config.py _
Dec  1 04:08:39 np0005540741 ansible-async_wrapper.py[51725]: Starting module and watcher
Dec  1 04:08:39 np0005540741 ansible-async_wrapper.py[51725]: Start watching 51726 (300)
Dec  1 04:08:39 np0005540741 ansible-async_wrapper.py[51726]: Start module (51726)
Dec  1 04:08:39 np0005540741 ansible-async_wrapper.py[51722]: Return async_wrapper task started.
Dec  1 04:08:39 np0005540741 python3.9[51727]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec  1 04:08:40 np0005540741 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec  1 04:08:40 np0005540741 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec  1 04:08:40 np0005540741 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec  1 04:08:40 np0005540741 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec  1 04:08:40 np0005540741 kernel: cfg80211: failed to load regulatory.db
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.4708] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.4722] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5144] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5145] audit: op="connection-add" uuid="6ff9277f-1405-47a9-9bd1-aea2fd3b8890" name="br-ex-br" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5157] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5158] audit: op="connection-add" uuid="b7d75a34-a156-4703-acf7-0960e9a4a3a8" name="br-ex-port" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5172] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5174] audit: op="connection-add" uuid="8ba334ab-5c46-407d-84e6-a38a95437ad2" name="eth1-port" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5185] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5187] audit: op="connection-add" uuid="8ad1b859-1982-4519-beea-586c02c240f1" name="vlan20-port" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5198] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5200] audit: op="connection-add" uuid="6415e037-1493-4144-b9b8-d12d21267004" name="vlan21-port" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5209] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5210] audit: op="connection-add" uuid="ce313015-bf7a-4e58-b576-c92d665db680" name="vlan22-port" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5220] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5221] audit: op="connection-add" uuid="f70ec55e-8747-47cb-9d67-31105dd4392b" name="vlan23-port" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5239] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode,connection.autoconnect-priority,connection.timestamp" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5253] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5254] audit: op="connection-add" uuid="64a49a94-4c59-4bd3-bee9-d0d6482501e6" name="br-ex-if" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5293] audit: op="connection-update" uuid="178f6c72-5a8d-5d35-b5e8-6b22bb1c98c8" name="ci-private-network" args="ipv4.never-default,ipv4.dns,ipv4.method,ipv4.addresses,ipv4.routes,ipv4.routing-rules,ipv6.dns,ipv6.method,ipv6.addresses,ipv6.addr-gen-mode,ipv6.routing-rules,ipv6.routes,ovs-interface.type,ovs-external-ids.data,connection.port-type,connection.slave-type,connection.controller,connection.master,connection.timestamp" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5308] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5309] audit: op="connection-add" uuid="3f81cbde-6e8a-48c2-917a-ed56ef6b1b23" name="vlan20-if" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5322] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5323] audit: op="connection-add" uuid="099afefb-306c-4b88-8220-cc7739999342" name="vlan21-if" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5336] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5337] audit: op="connection-add" uuid="e2eeff41-c3c2-496d-8f47-3776c5e2de71" name="vlan22-if" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5351] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5352] audit: op="connection-add" uuid="b3a395d1-9bfe-4345-8be8-1087ccd4ef3f" name="vlan23-if" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5362] audit: op="connection-delete" uuid="c6b7d4f6-4237-35c7-90cb-622f3da1d185" name="Wired connection 1" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5370] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5378] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5381] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (6ff9277f-1405-47a9-9bd1-aea2fd3b8890)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5381] audit: op="connection-activate" uuid="6ff9277f-1405-47a9-9bd1-aea2fd3b8890" name="br-ex-br" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5382] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5387] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5389] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (b7d75a34-a156-4703-acf7-0960e9a4a3a8)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5391] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5395] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5398] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (8ba334ab-5c46-407d-84e6-a38a95437ad2)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5399] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5404] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5406] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (8ad1b859-1982-4519-beea-586c02c240f1)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5407] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5412] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5415] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (6415e037-1493-4144-b9b8-d12d21267004)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5416] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5420] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5423] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (ce313015-bf7a-4e58-b576-c92d665db680)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5425] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5429] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5432] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (f70ec55e-8747-47cb-9d67-31105dd4392b)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5432] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5434] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5435] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5439] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5442] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5445] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (64a49a94-4c59-4bd3-bee9-d0d6482501e6)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5446] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5448] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5449] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5449] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5450] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5459] device (eth1): disconnecting for new activation request.
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5460] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5462] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5463] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5464] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5466] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5469] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5473] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (3f81cbde-6e8a-48c2-917a-ed56ef6b1b23)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5473] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5475] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5476] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5477] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5479] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5482] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5484] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (099afefb-306c-4b88-8220-cc7739999342)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5485] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5487] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5488] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5489] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5490] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5493] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5496] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (e2eeff41-c3c2-496d-8f47-3776c5e2de71)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5496] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5498] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5499] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5500] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5502] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5507] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5509] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (b3a395d1-9bfe-4345-8be8-1087ccd4ef3f)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5510] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5512] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5514] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5514] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5515] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5526] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,ipv6.method,ipv6.addr-gen-mode,connection.autoconnect-priority" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5528] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5530] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5531] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5536] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5538] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5540] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5546] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5548] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 kernel: ovs-system: entered promiscuous mode
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5554] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5560] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5566] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5567] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 systemd-udevd[51732]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:08:41 np0005540741 kernel: Timeout policy base is empty
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5571] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5576] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5580] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5582] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5587] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5591] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5594] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5596] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5600] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5604] dhcp4 (eth0): canceled DHCP transaction
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5604] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5604] dhcp4 (eth0): state changed no lease
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5605] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5614] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5617] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51728 uid=0 result="fail" reason="Device is not activated"
Dec  1 04:08:41 np0005540741 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5654] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5657] dhcp4 (eth0): state changed new lease, address=38.102.83.132
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5661] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5678] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec  1 04:08:41 np0005540741 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5725] device (eth1): disconnecting for new activation request.
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5726] audit: op="connection-activate" uuid="178f6c72-5a8d-5d35-b5e8-6b22bb1c98c8" name="ci-private-network" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5727] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5732] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec  1 04:08:41 np0005540741 kernel: br-ex: entered promiscuous mode
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5850] device (eth1): Activation: starting connection 'ci-private-network' (178f6c72-5a8d-5d35-b5e8-6b22bb1c98c8)
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5881] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5887] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5899] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5901] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51728 uid=0 result="success"
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5901] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5903] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5905] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5909] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5912] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5914] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5918] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5932] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 kernel: vlan22: entered promiscuous mode
Dec  1 04:08:41 np0005540741 systemd-udevd[51733]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5938] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5944] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5952] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5958] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5964] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5969] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5974] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5980] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5988] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.5994] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6000] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6004] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6010] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6019] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec  1 04:08:41 np0005540741 kernel: vlan21: entered promiscuous mode
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6027] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6056] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6060] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6071] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 kernel: vlan23: entered promiscuous mode
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6084] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6090] device (eth1): Activation: successful, device activated.
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6132] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6139] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 kernel: vlan20: entered promiscuous mode
Dec  1 04:08:41 np0005540741 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6158] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec  1 04:08:41 np0005540741 systemd-udevd[51839]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6170] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6183] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6207] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6214] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6216] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6219] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6225] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6230] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6235] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6243] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6245] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6248] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6255] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6260] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6266] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6279] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6293] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6336] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6339] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:08:41 np0005540741 NetworkManager[48954]: <info>  [1764580121.6346] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  1 04:08:42 np0005540741 NetworkManager[48954]: <info>  [1764580122.7561] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51728 uid=0 result="success"
Dec  1 04:08:42 np0005540741 NetworkManager[48954]: <info>  [1764580122.9494] checkpoint[0x55bcd2323950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec  1 04:08:42 np0005540741 NetworkManager[48954]: <info>  [1764580122.9497] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51728 uid=0 result="success"
Dec  1 04:08:43 np0005540741 NetworkManager[48954]: <info>  [1764580123.3528] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51728 uid=0 result="success"
Dec  1 04:08:43 np0005540741 NetworkManager[48954]: <info>  [1764580123.3542] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51728 uid=0 result="success"
Dec  1 04:08:43 np0005540741 python3.9[52088]: ansible-ansible.legacy.async_status Invoked with jid=j688281423766.51722 mode=status _async_dir=/root/.ansible_async
Dec  1 04:08:43 np0005540741 NetworkManager[48954]: <info>  [1764580123.5862] audit: op="networking-control" arg="global-dns-configuration" pid=51728 uid=0 result="success"
Dec  1 04:08:43 np0005540741 NetworkManager[48954]: <info>  [1764580123.5887] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec  1 04:08:43 np0005540741 NetworkManager[48954]: <info>  [1764580123.5909] audit: op="networking-control" arg="global-dns-configuration" pid=51728 uid=0 result="success"
Dec  1 04:08:43 np0005540741 NetworkManager[48954]: <info>  [1764580123.5929] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51728 uid=0 result="success"
Dec  1 04:08:43 np0005540741 NetworkManager[48954]: <info>  [1764580123.7388] checkpoint[0x55bcd2323a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec  1 04:08:43 np0005540741 NetworkManager[48954]: <info>  [1764580123.7392] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51728 uid=0 result="success"
Dec  1 04:08:43 np0005540741 ansible-async_wrapper.py[51726]: Module complete (51726)
Dec  1 04:08:44 np0005540741 ansible-async_wrapper.py[51725]: Done in kid B.
Dec  1 04:08:46 np0005540741 python3.9[52193]: ansible-ansible.legacy.async_status Invoked with jid=j688281423766.51722 mode=status _async_dir=/root/.ansible_async
Dec  1 04:08:47 np0005540741 python3.9[52292]: ansible-ansible.legacy.async_status Invoked with jid=j688281423766.51722 mode=cleanup _async_dir=/root/.ansible_async
Dec  1 04:08:47 np0005540741 python3.9[52444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:08:48 np0005540741 python3.9[52567]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580127.5453684-322-114557561447449/.source.returncode _original_basename=.ejppru9l follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:08:49 np0005540741 python3.9[52719]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:08:49 np0005540741 python3.9[52842]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580128.7046254-338-33903186203364/.source.cfg _original_basename=.jf5y7acg follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:08:50 np0005540741 python3.9[52995]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:08:50 np0005540741 systemd[1]: Reloading Network Manager...
Dec  1 04:08:50 np0005540741 NetworkManager[48954]: <info>  [1764580130.3978] audit: op="reload" arg="0" pid=52999 uid=0 result="success"
Dec  1 04:08:50 np0005540741 NetworkManager[48954]: <info>  [1764580130.3985] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec  1 04:08:50 np0005540741 systemd[1]: Reloaded Network Manager.
Dec  1 04:08:50 np0005540741 systemd[1]: session-10.scope: Deactivated successfully.
Dec  1 04:08:50 np0005540741 systemd[1]: session-10.scope: Consumed 47.040s CPU time.
Dec  1 04:08:50 np0005540741 systemd-logind[788]: Session 10 logged out. Waiting for processes to exit.
Dec  1 04:08:50 np0005540741 systemd-logind[788]: Removed session 10.
Dec  1 04:08:52 np0005540741 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  1 04:08:56 np0005540741 systemd-logind[788]: New session 11 of user zuul.
Dec  1 04:08:56 np0005540741 systemd[1]: Started Session 11 of User zuul.
Dec  1 04:08:57 np0005540741 python3.9[53185]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:08:58 np0005540741 python3.9[53339]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:08:59 np0005540741 python3.9[53533]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:09:00 np0005540741 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 04:09:00 np0005540741 systemd[1]: session-11.scope: Deactivated successfully.
Dec  1 04:09:00 np0005540741 systemd[1]: session-11.scope: Consumed 2.233s CPU time.
Dec  1 04:09:00 np0005540741 systemd-logind[788]: Session 11 logged out. Waiting for processes to exit.
Dec  1 04:09:00 np0005540741 systemd-logind[788]: Removed session 11.
Dec  1 04:09:05 np0005540741 systemd-logind[788]: New session 12 of user zuul.
Dec  1 04:09:05 np0005540741 systemd[1]: Started Session 12 of User zuul.
Dec  1 04:09:06 np0005540741 python3.9[53716]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:09:07 np0005540741 python3.9[53870]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:09:08 np0005540741 python3.9[54026]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:09:09 np0005540741 python3.9[54110]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:09:11 np0005540741 python3.9[54264]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:09:12 np0005540741 python3.9[54459]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:13 np0005540741 python3.9[54612]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:09:13 np0005540741 podman[54613]: 2025-12-01 09:09:13.167727101 +0000 UTC m=+0.042176780 system refresh
Dec  1 04:09:13 np0005540741 python3.9[54776]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:09:14 np0005540741 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:09:14 np0005540741 python3.9[54899]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580153.327181-79-207350350659281/.source.json follow=False _original_basename=podman_network_config.j2 checksum=b092e8926a42991481e8661bbb2548b1c09df469 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:15 np0005540741 python3.9[55051]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:09:15 np0005540741 python3.9[55174]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764580154.8477156-94-166247171625002/.source.conf follow=False _original_basename=registries.conf.j2 checksum=f95551851a3aad1fadf39ba40ad5808b10502fe1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:09:16 np0005540741 python3.9[55326]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:09:17 np0005540741 python3.9[55478]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:09:17 np0005540741 python3.9[55630]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:09:18 np0005540741 python3.9[55782]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:09:19 np0005540741 python3.9[55934]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:09:21 np0005540741 python3.9[56087]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:09:22 np0005540741 python3.9[56241]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:09:22 np0005540741 python3.9[56393]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:09:23 np0005540741 python3.9[56545]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:09:24 np0005540741 python3.9[56698]: ansible-service_facts Invoked
Dec  1 04:09:24 np0005540741 network[56715]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:09:24 np0005540741 network[56716]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:09:24 np0005540741 network[56717]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:09:28 np0005540741 python3.9[57169]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:09:31 np0005540741 python3.9[57322]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec  1 04:09:32 np0005540741 python3.9[57474]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:09:33 np0005540741 python3.9[57599]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580172.1676557-238-90243860553674/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:33 np0005540741 python3.9[57753]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:09:34 np0005540741 python3.9[57878]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580173.536461-253-134057119867782/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:35 np0005540741 python3.9[58032]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:36 np0005540741 python3.9[58186]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:09:37 np0005540741 python3.9[58270]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:09:39 np0005540741 python3.9[58424]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:09:39 np0005540741 python3.9[58508]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:09:39 np0005540741 chronyd[791]: chronyd exiting
Dec  1 04:09:39 np0005540741 systemd[1]: Stopping NTP client/server...
Dec  1 04:09:39 np0005540741 systemd[1]: chronyd.service: Deactivated successfully.
Dec  1 04:09:39 np0005540741 systemd[1]: Stopped NTP client/server.
Dec  1 04:09:39 np0005540741 systemd[1]: Starting NTP client/server...
Dec  1 04:09:39 np0005540741 chronyd[58516]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec  1 04:09:39 np0005540741 chronyd[58516]: Frequency -28.434 +/- 0.269 ppm read from /var/lib/chrony/drift
Dec  1 04:09:39 np0005540741 chronyd[58516]: Loaded seccomp filter (level 2)
Dec  1 04:09:39 np0005540741 systemd[1]: Started NTP client/server.
Dec  1 04:09:40 np0005540741 systemd[1]: session-12.scope: Deactivated successfully.
Dec  1 04:09:40 np0005540741 systemd[1]: session-12.scope: Consumed 24.157s CPU time.
Dec  1 04:09:40 np0005540741 systemd-logind[788]: Session 12 logged out. Waiting for processes to exit.
Dec  1 04:09:40 np0005540741 systemd-logind[788]: Removed session 12.
Dec  1 04:09:46 np0005540741 systemd-logind[788]: New session 13 of user zuul.
Dec  1 04:09:46 np0005540741 systemd[1]: Started Session 13 of User zuul.
Dec  1 04:09:46 np0005540741 python3.9[58697]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:47 np0005540741 python3.9[58849]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:09:48 np0005540741 python3.9[58972]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580187.228003-34-274994467968404/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:48 np0005540741 systemd[1]: session-13.scope: Deactivated successfully.
Dec  1 04:09:48 np0005540741 systemd[1]: session-13.scope: Consumed 1.402s CPU time.
Dec  1 04:09:48 np0005540741 systemd-logind[788]: Session 13 logged out. Waiting for processes to exit.
Dec  1 04:09:48 np0005540741 systemd-logind[788]: Removed session 13.
Dec  1 04:09:54 np0005540741 systemd-logind[788]: New session 14 of user zuul.
Dec  1 04:09:54 np0005540741 systemd[1]: Started Session 14 of User zuul.
Dec  1 04:09:55 np0005540741 python3.9[59150]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:09:56 np0005540741 python3.9[59306]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:57 np0005540741 python3.9[59481]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:09:58 np0005540741 python3.9[59604]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764580197.0280848-41-105053849947433/.source.json _original_basename=.lxm6qfx3 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:59 np0005540741 python3.9[59756]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:09:59 np0005540741 python3.9[59879]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580198.6725807-64-24533966855836/.source _original_basename=.8r9lzn_l follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:00 np0005540741 python3.9[60031]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:10:00 np0005540741 python3.9[60183]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:10:01 np0005540741 python3.9[60306]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764580200.3390658-88-258182885914952/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:10:01 np0005540741 python3.9[60458]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:10:02 np0005540741 python3.9[60581]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764580201.429198-88-118041360139777/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:10:02 np0005540741 python3.9[60733]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:03 np0005540741 python3.9[60885]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:10:04 np0005540741 python3.9[61008]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580203.1297405-125-161627068528125/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:04 np0005540741 python3.9[61160]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:10:05 np0005540741 python3.9[61283]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580204.268697-140-234548500461708/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:06 np0005540741 python3.9[61435]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:10:06 np0005540741 systemd[1]: Reloading.
Dec  1 04:10:06 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:10:06 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:10:06 np0005540741 systemd[1]: Reloading.
Dec  1 04:10:06 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:10:06 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:10:06 np0005540741 systemd[1]: Starting EDPM Container Shutdown...
Dec  1 04:10:06 np0005540741 systemd[1]: Finished EDPM Container Shutdown.
Dec  1 04:10:07 np0005540741 python3.9[61662]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:10:08 np0005540741 python3.9[61785]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580207.1460686-163-213283092680509/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:08 np0005540741 python3.9[61937]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:10:09 np0005540741 python3.9[62060]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580208.2299519-178-61878134684921/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:09 np0005540741 python3.9[62212]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:10:09 np0005540741 systemd[1]: Reloading.
Dec  1 04:10:09 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:10:09 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:10:10 np0005540741 systemd[1]: Reloading.
Dec  1 04:10:10 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:10:10 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:10:10 np0005540741 systemd[1]: Starting Create netns directory...
Dec  1 04:10:10 np0005540741 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  1 04:10:10 np0005540741 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  1 04:10:10 np0005540741 systemd[1]: Finished Create netns directory.
Dec  1 04:10:11 np0005540741 python3.9[62439]: ansible-ansible.builtin.service_facts Invoked
Dec  1 04:10:11 np0005540741 network[62456]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:10:11 np0005540741 network[62457]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:10:11 np0005540741 network[62458]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:10:15 np0005540741 python3.9[62720]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:10:15 np0005540741 systemd[1]: Reloading.
Dec  1 04:10:15 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:10:15 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:10:15 np0005540741 systemd[1]: Stopping IPv4 firewall with iptables...
Dec  1 04:10:15 np0005540741 iptables.init[62761]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec  1 04:10:15 np0005540741 iptables.init[62761]: iptables: Flushing firewall rules: [  OK  ]
Dec  1 04:10:15 np0005540741 systemd[1]: iptables.service: Deactivated successfully.
Dec  1 04:10:15 np0005540741 systemd[1]: Stopped IPv4 firewall with iptables.
Dec  1 04:10:16 np0005540741 python3.9[62957]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:10:17 np0005540741 python3.9[63111]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:10:17 np0005540741 systemd[1]: Reloading.
Dec  1 04:10:17 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:10:17 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:10:17 np0005540741 systemd[1]: Starting Netfilter Tables...
Dec  1 04:10:17 np0005540741 systemd[1]: Finished Netfilter Tables.
Dec  1 04:10:18 np0005540741 python3.9[63303]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:10:19 np0005540741 python3.9[63456]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:10:19 np0005540741 python3.9[63581]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580218.9172382-247-109808846649777/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:20 np0005540741 python3.9[63734]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:10:20 np0005540741 systemd[1]: Reloading OpenSSH server daemon...
Dec  1 04:10:20 np0005540741 systemd[1]: Reloaded OpenSSH server daemon.
Dec  1 04:10:21 np0005540741 python3.9[63890]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:22 np0005540741 python3.9[64042]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:10:22 np0005540741 python3.9[64165]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580221.5954757-278-265994370727437/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:23 np0005540741 python3.9[64317]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  1 04:10:23 np0005540741 systemd[1]: Starting Time & Date Service...
Dec  1 04:10:23 np0005540741 systemd[1]: Started Time & Date Service.
Dec  1 04:10:24 np0005540741 python3.9[64473]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:25 np0005540741 python3.9[64625]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:10:25 np0005540741 python3.9[64748]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580224.6005318-313-169543447719693/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:26 np0005540741 python3.9[64900]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:10:26 np0005540741 python3.9[65023]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580225.8162234-328-265101831653663/.source.yaml _original_basename=.12qrcw_2 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:27 np0005540741 python3.9[65175]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:10:27 np0005540741 python3.9[65298]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580227.00589-343-144813562374855/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:28 np0005540741 python3.9[65450]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:10:29 np0005540741 python3.9[65603]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:10:30 np0005540741 python3[65756]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  1 04:10:31 np0005540741 python3.9[65908]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:10:31 np0005540741 python3.9[66031]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580230.560174-382-238634775695896/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:32 np0005540741 python3.9[66183]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:10:32 np0005540741 python3.9[66306]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580231.7545607-397-158485207755866/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:33 np0005540741 python3.9[66458]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:10:33 np0005540741 python3.9[66581]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580232.8348715-412-119528364752335/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:34 np0005540741 python3.9[66733]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:10:35 np0005540741 python3.9[66856]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580233.9697518-427-182805217374537/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:35 np0005540741 python3.9[67008]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:10:36 np0005540741 python3.9[67131]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580235.2278483-442-242492941712870/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:36 np0005540741 python3.9[67283]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:37 np0005540741 python3.9[67435]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:10:38 np0005540741 python3.9[67594]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:39 np0005540741 python3.9[67747]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:40 np0005540741 python3.9[67899]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:40 np0005540741 python3.9[68051]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  1 04:10:41 np0005540741 python3.9[68204]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  1 04:10:42 np0005540741 systemd[1]: session-14.scope: Deactivated successfully.
Dec  1 04:10:42 np0005540741 systemd[1]: session-14.scope: Consumed 31.997s CPU time.
Dec  1 04:10:42 np0005540741 systemd-logind[788]: Session 14 logged out. Waiting for processes to exit.
Dec  1 04:10:42 np0005540741 systemd-logind[788]: Removed session 14.
Dec  1 04:10:47 np0005540741 systemd-logind[788]: New session 15 of user zuul.
Dec  1 04:10:47 np0005540741 systemd[1]: Started Session 15 of User zuul.
Dec  1 04:10:48 np0005540741 python3.9[68385]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec  1 04:10:49 np0005540741 python3.9[68537]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:10:50 np0005540741 python3.9[68689]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:10:51 np0005540741 python3.9[68841]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDRTxmAPcz2eFUCrQOAknLp4ibCvALuiJ7iA+ICPT8Mpd8XYcXDdZBZjlSgWd0U+d6qvFNYaJ4Kq/cNnxeSVMCkpQCGri3TTRfaS9L5COiCf0cmBNheHZSQL0uZLjKzjeaIyGWH6HdOA7KUsCK2YT/Iyf0OJzrBs5vhWuzbSXsCjsHTSzR+XxRX3C/ImHAtccLwxysUhm6H4CGIPn0bY/YGgoRkJUvouHT/4kSxhQrtFAKJOWlJ01d3tdISKrGa+SiKU6zq4yCgT5yeSsMSRyP+L06UuH7Htv2BSPXmTFLy8alJrAKLo19SllAr6m5ZP3OWy9eRDvp+oa4ZA3J9JX+isLwhjDkF1Q+aes+99JQ6E7W5hL8qvDAHCwaKgIo1IRMHJEVvZNsKqn+ME9EBDD1WyTNzik/qEOj2Cr9TXxmps8zD0VcngBAhdAv39R6EAPnVfRf1Goyagp6gPsCOeulh58jgrvAZ7L89u1J5yZY4C2Cu9js9UJwp46pdgU5qDDM=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILCzsFh+ZK0hqueDU2gWvb+j6m7hD/RYc8+thzHnJPmj#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN8lUi9ZvyyCZ7KdPvA7WBYtjDR8VhQzZuiukEvvpoRp0UJKIzVf11cXzP5sRkLnexUeWiXTv+jZK8hoAN9Othc=#012 create=True mode=0644 path=/tmp/ansible.p_slgdzj state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:52 np0005540741 python3.9[68993]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.p_slgdzj' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:10:52 np0005540741 python3.9[69147]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.p_slgdzj state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:53 np0005540741 systemd[1]: session-15.scope: Deactivated successfully.
Dec  1 04:10:53 np0005540741 systemd[1]: session-15.scope: Consumed 3.015s CPU time.
Dec  1 04:10:53 np0005540741 systemd-logind[788]: Session 15 logged out. Waiting for processes to exit.
Dec  1 04:10:53 np0005540741 systemd-logind[788]: Removed session 15.
Dec  1 04:10:53 np0005540741 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  1 04:11:00 np0005540741 systemd-logind[788]: New session 16 of user zuul.
Dec  1 04:11:00 np0005540741 systemd[1]: Started Session 16 of User zuul.
Dec  1 04:11:01 np0005540741 python3.9[69327]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:11:02 np0005540741 python3.9[69485]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  1 04:11:03 np0005540741 python3.9[69639]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:11:04 np0005540741 python3.9[69792]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:11:05 np0005540741 python3.9[69945]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:11:05 np0005540741 python3.9[70099]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:11:06 np0005540741 python3.9[70254]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:11:06 np0005540741 systemd[1]: session-16.scope: Deactivated successfully.
Dec  1 04:11:06 np0005540741 systemd[1]: session-16.scope: Consumed 4.105s CPU time.
Dec  1 04:11:06 np0005540741 systemd-logind[788]: Session 16 logged out. Waiting for processes to exit.
Dec  1 04:11:06 np0005540741 systemd-logind[788]: Removed session 16.
Dec  1 04:11:12 np0005540741 systemd-logind[788]: New session 17 of user zuul.
Dec  1 04:11:12 np0005540741 systemd[1]: Started Session 17 of User zuul.
Dec  1 04:11:13 np0005540741 python3.9[70432]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:11:14 np0005540741 python3.9[70588]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:11:15 np0005540741 python3.9[70672]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  1 04:11:17 np0005540741 python3.9[70823]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:11:18 np0005540741 python3.9[70974]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 04:11:19 np0005540741 python3.9[71124]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:11:19 np0005540741 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:11:19 np0005540741 python3.9[71275]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:11:20 np0005540741 systemd[1]: session-17.scope: Deactivated successfully.
Dec  1 04:11:20 np0005540741 systemd[1]: session-17.scope: Consumed 5.523s CPU time.
Dec  1 04:11:20 np0005540741 systemd-logind[788]: Session 17 logged out. Waiting for processes to exit.
Dec  1 04:11:20 np0005540741 systemd-logind[788]: Removed session 17.
Dec  1 04:11:27 np0005540741 systemd-logind[788]: New session 18 of user zuul.
Dec  1 04:11:27 np0005540741 systemd[1]: Started Session 18 of User zuul.
Dec  1 04:11:33 np0005540741 python3[72041]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:11:34 np0005540741 python3[72136]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec  1 04:11:36 np0005540741 python3[72163]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  1 04:11:36 np0005540741 python3[72189]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:11:36 np0005540741 kernel: loop: module loaded
Dec  1 04:11:36 np0005540741 kernel: loop3: detected capacity change from 0 to 41943040
Dec  1 04:11:36 np0005540741 python3[72225]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:11:36 np0005540741 lvm[72228]: PV /dev/loop3 not used.
Dec  1 04:11:37 np0005540741 lvm[72237]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 04:11:37 np0005540741 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Dec  1 04:11:37 np0005540741 lvm[72239]:  1 logical volume(s) in volume group "ceph_vg0" now active
Dec  1 04:11:37 np0005540741 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Dec  1 04:11:37 np0005540741 python3[72317]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:11:37 np0005540741 python3[72390]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580297.2150416-36106-155201303425810/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:11:38 np0005540741 python3[72440]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:11:38 np0005540741 systemd[1]: Reloading.
Dec  1 04:11:38 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:11:38 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:11:38 np0005540741 systemd[1]: Starting Ceph OSD losetup...
Dec  1 04:11:38 np0005540741 bash[72480]: /dev/loop3: [64513]:4194937 (/var/lib/ceph-osd-0.img)
Dec  1 04:11:38 np0005540741 systemd[1]: Finished Ceph OSD losetup.
Dec  1 04:11:38 np0005540741 lvm[72481]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 04:11:38 np0005540741 lvm[72481]: VG ceph_vg0 finished
Dec  1 04:11:39 np0005540741 python3[72507]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec  1 04:11:40 np0005540741 python3[72534]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  1 04:11:41 np0005540741 python3[72560]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=20G#012losetup /dev/loop4 /var/lib/ceph-osd-1.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:11:41 np0005540741 kernel: loop4: detected capacity change from 0 to 41943040
Dec  1 04:11:41 np0005540741 python3[72592]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4#012vgcreate ceph_vg1 /dev/loop4#012lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:11:41 np0005540741 lvm[72595]: PV /dev/loop4 not used.
Dec  1 04:11:41 np0005540741 lvm[72605]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec  1 04:11:41 np0005540741 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Dec  1 04:11:41 np0005540741 lvm[72607]:  1 logical volume(s) in volume group "ceph_vg1" now active
Dec  1 04:11:41 np0005540741 systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Dec  1 04:11:42 np0005540741 python3[72685]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:11:42 np0005540741 python3[72758]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580301.7458215-36133-278484422018864/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:11:42 np0005540741 python3[72808]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:11:42 np0005540741 systemd[1]: Reloading.
Dec  1 04:11:42 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:11:42 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:11:43 np0005540741 systemd[1]: Starting Ceph OSD losetup...
Dec  1 04:11:43 np0005540741 bash[72848]: /dev/loop4: [64513]:4327981 (/var/lib/ceph-osd-1.img)
Dec  1 04:11:43 np0005540741 systemd[1]: Finished Ceph OSD losetup.
Dec  1 04:11:43 np0005540741 lvm[72849]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec  1 04:11:43 np0005540741 lvm[72849]: VG ceph_vg1 finished
Dec  1 04:11:43 np0005540741 python3[72875]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec  1 04:11:45 np0005540741 python3[72902]: ansible-ansible.builtin.stat Invoked with path=/dev/loop5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  1 04:11:45 np0005540741 python3[72928]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-2.img bs=1 count=0 seek=20G#012losetup /dev/loop5 /var/lib/ceph-osd-2.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:11:45 np0005540741 kernel: loop5: detected capacity change from 0 to 41943040
Dec  1 04:11:45 np0005540741 python3[72960]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop5#012vgcreate ceph_vg2 /dev/loop5#012lvcreate -n ceph_lv2 -l +100%FREE ceph_vg2#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:11:45 np0005540741 lvm[72963]: PV /dev/loop5 not used.
Dec  1 04:11:45 np0005540741 lvm[72965]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec  1 04:11:45 np0005540741 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg2.
Dec  1 04:11:45 np0005540741 lvm[72975]:  1 logical volume(s) in volume group "ceph_vg2" now active
Dec  1 04:11:46 np0005540741 systemd[1]: lvm-activate-ceph_vg2.service: Deactivated successfully.
Dec  1 04:11:46 np0005540741 python3[73053]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-2.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:11:46 np0005540741 python3[73126]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580306.152131-36160-179906285034520/source dest=/etc/systemd/system/ceph-osd-losetup-2.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=4c5b1bc5693c499ffe2edaa97d63f5df7075d845 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:11:47 np0005540741 python3[73176]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-2.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:11:47 np0005540741 systemd[1]: Reloading.
Dec  1 04:11:47 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:11:47 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:11:47 np0005540741 systemd[1]: Starting Ceph OSD losetup...
Dec  1 04:11:47 np0005540741 bash[73215]: /dev/loop5: [64513]:4327982 (/var/lib/ceph-osd-2.img)
Dec  1 04:11:47 np0005540741 systemd[1]: Finished Ceph OSD losetup.
Dec  1 04:11:47 np0005540741 lvm[73216]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec  1 04:11:47 np0005540741 lvm[73216]: VG ceph_vg2 finished
Dec  1 04:11:49 np0005540741 chronyd[58516]: Selected source 162.159.200.1 (pool.ntp.org)
Dec  1 04:11:49 np0005540741 python3[73240]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:11:51 np0005540741 python3[73333]: ansible-ansible.legacy.dnf Invoked with name=['cephadm'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec  1 04:11:53 np0005540741 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:11:53 np0005540741 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:11:53 np0005540741 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:11:53 np0005540741 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:11:53 np0005540741 systemd[1]: run-r370aa678a822416aa3de8c07224b703c.service: Deactivated successfully.
Dec  1 04:11:53 np0005540741 python3[73444]: ansible-ansible.builtin.stat Invoked with path=/usr/sbin/cephadm follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  1 04:11:54 np0005540741 python3[73472]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/cephadm ls --no-detail _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:11:54 np0005540741 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:11:54 np0005540741 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:11:55 np0005540741 python3[73535]: ansible-ansible.builtin.file Invoked with path=/etc/ceph state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:11:55 np0005540741 python3[73561]: ansible-ansible.builtin.file Invoked with path=/home/ceph-admin/specs owner=ceph-admin group=ceph-admin mode=0755 state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:11:55 np0005540741 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:11:55 np0005540741 python3[73639]: ansible-ansible.legacy.stat Invoked with path=/home/ceph-admin/specs/ceph_spec.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:11:56 np0005540741 python3[73712]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580315.6917045-36307-268234461242790/source dest=/home/ceph-admin/specs/ceph_spec.yaml owner=ceph-admin group=ceph-admin mode=0644 _original_basename=ceph_spec.yml follow=False checksum=bb83c53af4ffd926a3f1eafe26a8be437df6401f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:11:57 np0005540741 python3[73814]: ansible-ansible.legacy.stat Invoked with path=/home/ceph-admin/assimilate_ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:11:57 np0005540741 python3[73887]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580316.8206906-36325-132693539641527/source dest=/home/ceph-admin/assimilate_ceph.conf owner=ceph-admin group=ceph-admin mode=0644 _original_basename=initial_ceph.conf follow=False checksum=41828f7c2442fdf376911255e33c12863fc3b1b3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:11:57 np0005540741 python3[73937]: ansible-ansible.builtin.stat Invoked with path=/home/ceph-admin/.ssh/id_rsa follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  1 04:11:58 np0005540741 python3[73965]: ansible-ansible.builtin.stat Invoked with path=/home/ceph-admin/.ssh/id_rsa.pub follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  1 04:11:58 np0005540741 python3[73993]: ansible-ansible.builtin.stat Invoked with path=/home/ceph-admin/assimilate_ceph.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  1 04:11:58 np0005540741 python3[74021]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/cephadm bootstrap --skip-firewalld --skip-prepare-host --ssh-private-key /home/ceph-admin/.ssh/id_rsa --ssh-public-key /home/ceph-admin/.ssh/id_rsa.pub --ssh-user ceph-admin --allow-fqdn-hostname --output-keyring /etc/ceph/ceph.client.admin.keyring --output-config /etc/ceph/ceph.conf --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config /home/ceph-admin/assimilate_ceph.conf \--single-host-defaults \--skip-monitoring-stack --skip-dashboard --mon-ip 192.168.122.100#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:11:58 np0005540741 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:11:59 np0005540741 systemd-logind[788]: New session 19 of user ceph-admin.
Dec  1 04:11:59 np0005540741 systemd[1]: Created slice User Slice of UID 42477.
Dec  1 04:11:59 np0005540741 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec  1 04:11:59 np0005540741 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec  1 04:11:59 np0005540741 systemd[1]: Starting User Manager for UID 42477...
Dec  1 04:11:59 np0005540741 systemd[74040]: Queued start job for default target Main User Target.
Dec  1 04:11:59 np0005540741 systemd[74040]: Created slice User Application Slice.
Dec  1 04:11:59 np0005540741 systemd[74040]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  1 04:11:59 np0005540741 systemd[74040]: Started Daily Cleanup of User's Temporary Directories.
Dec  1 04:11:59 np0005540741 systemd[74040]: Reached target Paths.
Dec  1 04:11:59 np0005540741 systemd[74040]: Reached target Timers.
Dec  1 04:11:59 np0005540741 systemd[74040]: Starting D-Bus User Message Bus Socket...
Dec  1 04:11:59 np0005540741 systemd[74040]: Starting Create User's Volatile Files and Directories...
Dec  1 04:11:59 np0005540741 systemd[74040]: Listening on D-Bus User Message Bus Socket.
Dec  1 04:11:59 np0005540741 systemd[74040]: Reached target Sockets.
Dec  1 04:11:59 np0005540741 systemd[74040]: Finished Create User's Volatile Files and Directories.
Dec  1 04:11:59 np0005540741 systemd[74040]: Reached target Basic System.
Dec  1 04:11:59 np0005540741 systemd[74040]: Reached target Main User Target.
Dec  1 04:11:59 np0005540741 systemd[74040]: Startup finished in 121ms.
Dec  1 04:11:59 np0005540741 systemd[1]: Started User Manager for UID 42477.
Dec  1 04:11:59 np0005540741 systemd[1]: Started Session 19 of User ceph-admin.
Dec  1 04:11:59 np0005540741 systemd[1]: session-19.scope: Deactivated successfully.
Dec  1 04:11:59 np0005540741 systemd-logind[788]: Session 19 logged out. Waiting for processes to exit.
Dec  1 04:11:59 np0005540741 systemd-logind[788]: Removed session 19.
Dec  1 04:12:01 np0005540741 systemd[1]: var-lib-containers-storage-overlay-compat4187034717-merged.mount: Deactivated successfully.
Dec  1 04:12:02 np0005540741 systemd[1]: var-lib-containers-storage-overlay-compat4187034717-lower\x2dmapped.mount: Deactivated successfully.
Dec  1 04:12:09 np0005540741 systemd[1]: Stopping User Manager for UID 42477...
Dec  1 04:12:09 np0005540741 systemd[74040]: Activating special unit Exit the Session...
Dec  1 04:12:09 np0005540741 systemd[74040]: Stopped target Main User Target.
Dec  1 04:12:09 np0005540741 systemd[74040]: Stopped target Basic System.
Dec  1 04:12:09 np0005540741 systemd[74040]: Stopped target Paths.
Dec  1 04:12:09 np0005540741 systemd[74040]: Stopped target Sockets.
Dec  1 04:12:09 np0005540741 systemd[74040]: Stopped target Timers.
Dec  1 04:12:09 np0005540741 systemd[74040]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec  1 04:12:09 np0005540741 systemd[74040]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  1 04:12:09 np0005540741 systemd[74040]: Closed D-Bus User Message Bus Socket.
Dec  1 04:12:09 np0005540741 systemd[74040]: Stopped Create User's Volatile Files and Directories.
Dec  1 04:12:09 np0005540741 systemd[74040]: Removed slice User Application Slice.
Dec  1 04:12:09 np0005540741 systemd[74040]: Reached target Shutdown.
Dec  1 04:12:09 np0005540741 systemd[74040]: Finished Exit the Session.
Dec  1 04:12:09 np0005540741 systemd[74040]: Reached target Exit the Session.
Dec  1 04:12:09 np0005540741 systemd[1]: user@42477.service: Deactivated successfully.
Dec  1 04:12:09 np0005540741 systemd[1]: Stopped User Manager for UID 42477.
Dec  1 04:12:09 np0005540741 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Dec  1 04:12:09 np0005540741 systemd[1]: run-user-42477.mount: Deactivated successfully.
Dec  1 04:12:09 np0005540741 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Dec  1 04:12:09 np0005540741 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Dec  1 04:12:09 np0005540741 systemd[1]: Removed slice User Slice of UID 42477.
Dec  1 04:12:16 np0005540741 podman[74093]: 2025-12-01 09:12:16.851549721 +0000 UTC m=+17.513510633 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:16 np0005540741 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:12:16 np0005540741 podman[74153]: 2025-12-01 09:12:16.913325658 +0000 UTC m=+0.039011561 container create 27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa (image=quay.io/ceph/ceph:v18, name=keen_hawking, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3)
Dec  1 04:12:16 np0005540741 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec  1 04:12:16 np0005540741 systemd[1]: Started libpod-conmon-27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa.scope.
Dec  1 04:12:16 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:16 np0005540741 podman[74153]: 2025-12-01 09:12:16.896904341 +0000 UTC m=+0.022590264 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:17 np0005540741 podman[74153]: 2025-12-01 09:12:17.009672137 +0000 UTC m=+0.135358060 container init 27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa (image=quay.io/ceph/ceph:v18, name=keen_hawking, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True)
Dec  1 04:12:17 np0005540741 podman[74153]: 2025-12-01 09:12:17.01644614 +0000 UTC m=+0.142132043 container start 27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa (image=quay.io/ceph/ceph:v18, name=keen_hawking, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef)
Dec  1 04:12:17 np0005540741 podman[74153]: 2025-12-01 09:12:17.019602359 +0000 UTC m=+0.145288262 container attach 27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa (image=quay.io/ceph/ceph:v18, name=keen_hawking, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec  1 04:12:17 np0005540741 keen_hawking[74169]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable)
Dec  1 04:12:17 np0005540741 systemd[1]: libpod-27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa.scope: Deactivated successfully.
Dec  1 04:12:17 np0005540741 podman[74153]: 2025-12-01 09:12:17.353988947 +0000 UTC m=+0.479674850 container died 27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa (image=quay.io/ceph/ceph:v18, name=keen_hawking, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507)
Dec  1 04:12:17 np0005540741 systemd[1]: var-lib-containers-storage-overlay-c97369b947701ee14cc2d4e17e340c0afdd338111f12620cda00af2df1961ddc-merged.mount: Deactivated successfully.
Dec  1 04:12:17 np0005540741 podman[74153]: 2025-12-01 09:12:17.396317271 +0000 UTC m=+0.522003174 container remove 27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa (image=quay.io/ceph/ceph:v18, name=keen_hawking, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec  1 04:12:17 np0005540741 systemd[1]: libpod-conmon-27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa.scope: Deactivated successfully.
Dec  1 04:12:17 np0005540741 podman[74184]: 2025-12-01 09:12:17.458692154 +0000 UTC m=+0.043256731 container create e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15 (image=quay.io/ceph/ceph:v18, name=modest_goodall, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec  1 04:12:17 np0005540741 systemd[1]: Started libpod-conmon-e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15.scope.
Dec  1 04:12:17 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:17 np0005540741 podman[74184]: 2025-12-01 09:12:17.523982561 +0000 UTC m=+0.108547138 container init e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15 (image=quay.io/ceph/ceph:v18, name=modest_goodall, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:12:17 np0005540741 podman[74184]: 2025-12-01 09:12:17.531824714 +0000 UTC m=+0.116389281 container start e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15 (image=quay.io/ceph/ceph:v18, name=modest_goodall, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:12:17 np0005540741 podman[74184]: 2025-12-01 09:12:17.438606123 +0000 UTC m=+0.023170710 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:17 np0005540741 podman[74184]: 2025-12-01 09:12:17.534708896 +0000 UTC m=+0.119273463 container attach e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15 (image=quay.io/ceph/ceph:v18, name=modest_goodall, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:12:17 np0005540741 modest_goodall[74200]: 167 167
Dec  1 04:12:17 np0005540741 systemd[1]: libpod-e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15.scope: Deactivated successfully.
Dec  1 04:12:17 np0005540741 podman[74184]: 2025-12-01 09:12:17.535842038 +0000 UTC m=+0.120406625 container died e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15 (image=quay.io/ceph/ceph:v18, name=modest_goodall, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec  1 04:12:17 np0005540741 podman[74184]: 2025-12-01 09:12:17.566662194 +0000 UTC m=+0.151226761 container remove e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15 (image=quay.io/ceph/ceph:v18, name=modest_goodall, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec  1 04:12:17 np0005540741 systemd[1]: libpod-conmon-e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15.scope: Deactivated successfully.
Dec  1 04:12:17 np0005540741 podman[74217]: 2025-12-01 09:12:17.62597045 +0000 UTC m=+0.042411957 container create 7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66 (image=quay.io/ceph/ceph:v18, name=quirky_khorana, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec  1 04:12:17 np0005540741 systemd[1]: Started libpod-conmon-7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66.scope.
Dec  1 04:12:17 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:17 np0005540741 podman[74217]: 2025-12-01 09:12:17.674739457 +0000 UTC m=+0.091180964 container init 7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66 (image=quay.io/ceph/ceph:v18, name=quirky_khorana, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  1 04:12:17 np0005540741 podman[74217]: 2025-12-01 09:12:17.679382469 +0000 UTC m=+0.095823976 container start 7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66 (image=quay.io/ceph/ceph:v18, name=quirky_khorana, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Dec  1 04:12:17 np0005540741 podman[74217]: 2025-12-01 09:12:17.682677923 +0000 UTC m=+0.099119430 container attach 7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66 (image=quay.io/ceph/ceph:v18, name=quirky_khorana, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Dec  1 04:12:17 np0005540741 quirky_khorana[74234]: AQDxWy1pI9WqKRAAmTLyJtNSzDfZ1RFM9WLt+A==
Dec  1 04:12:17 np0005540741 podman[74217]: 2025-12-01 09:12:17.602890404 +0000 UTC m=+0.019331931 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:17 np0005540741 systemd[1]: libpod-7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66.scope: Deactivated successfully.
Dec  1 04:12:17 np0005540741 podman[74217]: 2025-12-01 09:12:17.702007612 +0000 UTC m=+0.118449119 container died 7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66 (image=quay.io/ceph/ceph:v18, name=quirky_khorana, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:12:17 np0005540741 podman[74217]: 2025-12-01 09:12:17.733491148 +0000 UTC m=+0.149932655 container remove 7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66 (image=quay.io/ceph/ceph:v18, name=quirky_khorana, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:12:17 np0005540741 systemd[1]: libpod-conmon-7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66.scope: Deactivated successfully.
Dec  1 04:12:17 np0005540741 podman[74252]: 2025-12-01 09:12:17.784861678 +0000 UTC m=+0.034007928 container create 8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f (image=quay.io/ceph/ceph:v18, name=reverent_banach, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec  1 04:12:17 np0005540741 systemd[1]: Started libpod-conmon-8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f.scope.
Dec  1 04:12:17 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:17 np0005540741 podman[74252]: 2025-12-01 09:12:17.83558384 +0000 UTC m=+0.084730120 container init 8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f (image=quay.io/ceph/ceph:v18, name=reverent_banach, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec  1 04:12:17 np0005540741 podman[74252]: 2025-12-01 09:12:17.840984704 +0000 UTC m=+0.090130944 container start 8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f (image=quay.io/ceph/ceph:v18, name=reverent_banach, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:12:17 np0005540741 podman[74252]: 2025-12-01 09:12:17.844408581 +0000 UTC m=+0.093554851 container attach 8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f (image=quay.io/ceph/ceph:v18, name=reverent_banach, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:12:17 np0005540741 reverent_banach[74268]: AQDxWy1pP3FMMxAAfPV/KsFRiaST9nKEJ2+JCg==
Dec  1 04:12:17 np0005540741 systemd[1]: libpod-8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f.scope: Deactivated successfully.
Dec  1 04:12:17 np0005540741 podman[74252]: 2025-12-01 09:12:17.863820833 +0000 UTC m=+0.112967083 container died 8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f (image=quay.io/ceph/ceph:v18, name=reverent_banach, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True)
Dec  1 04:12:17 np0005540741 podman[74252]: 2025-12-01 09:12:17.769839011 +0000 UTC m=+0.018985281 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:17 np0005540741 systemd[1]: var-lib-containers-storage-overlay-a105e0c8dd2ddef39ac4794ab31d183fe03c16bf9f2c14788fd5ed9a85f5209c-merged.mount: Deactivated successfully.
Dec  1 04:12:17 np0005540741 podman[74252]: 2025-12-01 09:12:17.897687016 +0000 UTC m=+0.146833266 container remove 8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f (image=quay.io/ceph/ceph:v18, name=reverent_banach, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:12:17 np0005540741 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:12:17 np0005540741 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:12:17 np0005540741 systemd[1]: libpod-conmon-8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f.scope: Deactivated successfully.
Dec  1 04:12:17 np0005540741 podman[74289]: 2025-12-01 09:12:17.953772801 +0000 UTC m=+0.038607009 container create 8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611 (image=quay.io/ceph/ceph:v18, name=youthful_swirles, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec  1 04:12:17 np0005540741 systemd[1]: Started libpod-conmon-8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611.scope.
Dec  1 04:12:17 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:18 np0005540741 podman[74289]: 2025-12-01 09:12:18.007680574 +0000 UTC m=+0.092514782 container init 8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611 (image=quay.io/ceph/ceph:v18, name=youthful_swirles, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec  1 04:12:18 np0005540741 podman[74289]: 2025-12-01 09:12:18.013007115 +0000 UTC m=+0.097841323 container start 8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611 (image=quay.io/ceph/ceph:v18, name=youthful_swirles, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Dec  1 04:12:18 np0005540741 podman[74289]: 2025-12-01 09:12:18.01600266 +0000 UTC m=+0.100836898 container attach 8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611 (image=quay.io/ceph/ceph:v18, name=youthful_swirles, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:12:18 np0005540741 podman[74289]: 2025-12-01 09:12:17.936910711 +0000 UTC m=+0.021744939 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:18 np0005540741 youthful_swirles[74306]: AQDyWy1pRr36ARAAZpgZl//Z9xb8bhYxEGCeew==
Dec  1 04:12:18 np0005540741 systemd[1]: libpod-8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611.scope: Deactivated successfully.
Dec  1 04:12:18 np0005540741 podman[74289]: 2025-12-01 09:12:18.03709935 +0000 UTC m=+0.121933558 container died 8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611 (image=quay.io/ceph/ceph:v18, name=youthful_swirles, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 04:12:18 np0005540741 podman[74289]: 2025-12-01 09:12:18.067021731 +0000 UTC m=+0.151855939 container remove 8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611 (image=quay.io/ceph/ceph:v18, name=youthful_swirles, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec  1 04:12:18 np0005540741 systemd[1]: libpod-conmon-8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611.scope: Deactivated successfully.
Dec  1 04:12:18 np0005540741 podman[74324]: 2025-12-01 09:12:18.119576545 +0000 UTC m=+0.035806179 container create 7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465 (image=quay.io/ceph/ceph:v18, name=trusting_margulis, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:12:18 np0005540741 systemd[1]: Started libpod-conmon-7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465.scope.
Dec  1 04:12:18 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5960d03bd29c24ef9aef0727615e0a698c519e73fbc9cf4fc532b37b74ef8a80/merged/tmp/monmap supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:18 np0005540741 podman[74324]: 2025-12-01 09:12:18.171684397 +0000 UTC m=+0.087914051 container init 7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465 (image=quay.io/ceph/ceph:v18, name=trusting_margulis, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Dec  1 04:12:18 np0005540741 podman[74324]: 2025-12-01 09:12:18.176424682 +0000 UTC m=+0.092654316 container start 7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465 (image=quay.io/ceph/ceph:v18, name=trusting_margulis, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec  1 04:12:18 np0005540741 podman[74324]: 2025-12-01 09:12:18.178999375 +0000 UTC m=+0.095228999 container attach 7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465 (image=quay.io/ceph/ceph:v18, name=trusting_margulis, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Dec  1 04:12:18 np0005540741 podman[74324]: 2025-12-01 09:12:18.104647711 +0000 UTC m=+0.020877375 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:18 np0005540741 trusting_margulis[74340]: /usr/bin/monmaptool: monmap file /tmp/monmap
Dec  1 04:12:18 np0005540741 trusting_margulis[74340]: setting min_mon_release = pacific
Dec  1 04:12:18 np0005540741 trusting_margulis[74340]: /usr/bin/monmaptool: set fsid to 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec  1 04:12:18 np0005540741 trusting_margulis[74340]: /usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors)
Dec  1 04:12:18 np0005540741 systemd[1]: libpod-7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465.scope: Deactivated successfully.
Dec  1 04:12:18 np0005540741 podman[74324]: 2025-12-01 09:12:18.208229096 +0000 UTC m=+0.124458730 container died 7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465 (image=quay.io/ceph/ceph:v18, name=trusting_margulis, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Dec  1 04:12:18 np0005540741 podman[74324]: 2025-12-01 09:12:18.236249433 +0000 UTC m=+0.152479067 container remove 7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465 (image=quay.io/ceph/ceph:v18, name=trusting_margulis, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:12:18 np0005540741 systemd[1]: libpod-conmon-7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465.scope: Deactivated successfully.
Dec  1 04:12:18 np0005540741 podman[74359]: 2025-12-01 09:12:18.297964177 +0000 UTC m=+0.043401695 container create b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79 (image=quay.io/ceph/ceph:v18, name=objective_jackson, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:12:18 np0005540741 systemd[1]: Started libpod-conmon-b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79.scope.
Dec  1 04:12:18 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/084135f1ce3964ab6032809724ca6bef79e7091cafc3fc73faef4360db430854/merged/tmp/monmap supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/084135f1ce3964ab6032809724ca6bef79e7091cafc3fc73faef4360db430854/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/084135f1ce3964ab6032809724ca6bef79e7091cafc3fc73faef4360db430854/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/084135f1ce3964ab6032809724ca6bef79e7091cafc3fc73faef4360db430854/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:18 np0005540741 podman[74359]: 2025-12-01 09:12:18.349175603 +0000 UTC m=+0.094613041 container init b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79 (image=quay.io/ceph/ceph:v18, name=objective_jackson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True)
Dec  1 04:12:18 np0005540741 podman[74359]: 2025-12-01 09:12:18.356262095 +0000 UTC m=+0.101699503 container start b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79 (image=quay.io/ceph/ceph:v18, name=objective_jackson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:12:18 np0005540741 podman[74359]: 2025-12-01 09:12:18.360057293 +0000 UTC m=+0.105494691 container attach b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79 (image=quay.io/ceph/ceph:v18, name=objective_jackson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 04:12:18 np0005540741 podman[74359]: 2025-12-01 09:12:18.278200185 +0000 UTC m=+0.023637613 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:18 np0005540741 systemd[1]: libpod-b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79.scope: Deactivated successfully.
Dec  1 04:12:18 np0005540741 podman[74359]: 2025-12-01 09:12:18.419667798 +0000 UTC m=+0.165105206 container died b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79 (image=quay.io/ceph/ceph:v18, name=objective_jackson, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Dec  1 04:12:18 np0005540741 podman[74359]: 2025-12-01 09:12:18.456366361 +0000 UTC m=+0.201803769 container remove b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79 (image=quay.io/ceph/ceph:v18, name=objective_jackson, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:12:18 np0005540741 systemd[1]: libpod-conmon-b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79.scope: Deactivated successfully.
Dec  1 04:12:18 np0005540741 systemd[1]: Reloading.
Dec  1 04:12:18 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:12:18 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:12:18 np0005540741 systemd[1]: Reloading.
Dec  1 04:12:18 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:12:18 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:12:18 np0005540741 systemd[1]: Reached target All Ceph clusters and services.
Dec  1 04:12:18 np0005540741 systemd[1]: Reloading.
Dec  1 04:12:19 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:12:19 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:12:19 np0005540741 systemd[1]: Reached target Ceph cluster 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec  1 04:12:19 np0005540741 systemd[1]: Reloading.
Dec  1 04:12:19 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:12:19 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:12:19 np0005540741 systemd[1]: Reloading.
Dec  1 04:12:19 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:12:19 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:12:19 np0005540741 systemd[1]: Created slice Slice /system/ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec  1 04:12:19 np0005540741 systemd[1]: Reached target System Time Set.
Dec  1 04:12:19 np0005540741 systemd[1]: Reached target System Time Synchronized.
Dec  1 04:12:19 np0005540741 systemd[1]: Starting Ceph mon.compute-0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec  1 04:12:19 np0005540741 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:12:19 np0005540741 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:12:19 np0005540741 podman[74654]: 2025-12-01 09:12:19.952789128 +0000 UTC m=+0.049421386 container create cefa5d72af91d86283f64418a14a9aa1bf5270d72d08d9ff1301e3ba772fd855 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec  1 04:12:19 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c3e450d579be7900aad28102be8a81fcc3c93003a0b07817df9ad5b3f76dae6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:19 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c3e450d579be7900aad28102be8a81fcc3c93003a0b07817df9ad5b3f76dae6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:19 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c3e450d579be7900aad28102be8a81fcc3c93003a0b07817df9ad5b3f76dae6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:19 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c3e450d579be7900aad28102be8a81fcc3c93003a0b07817df9ad5b3f76dae6/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:20 np0005540741 podman[74654]: 2025-12-01 09:12:20.014472262 +0000 UTC m=+0.111104550 container init cefa5d72af91d86283f64418a14a9aa1bf5270d72d08d9ff1301e3ba772fd855 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True)
Dec  1 04:12:20 np0005540741 podman[74654]: 2025-12-01 09:12:20.019707151 +0000 UTC m=+0.116339399 container start cefa5d72af91d86283f64418a14a9aa1bf5270d72d08d9ff1301e3ba772fd855 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:12:20 np0005540741 podman[74654]: 2025-12-01 09:12:19.929529227 +0000 UTC m=+0.026161505 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:20 np0005540741 bash[74654]: cefa5d72af91d86283f64418a14a9aa1bf5270d72d08d9ff1301e3ba772fd855
Dec  1 04:12:20 np0005540741 systemd[1]: Started Ceph mon.compute-0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mon, pid 2
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: pidfile_write: ignore empty --pid-file
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: load: jerasure load: lrc 
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: RocksDB version: 7.9.2
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Git sha 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Compile date 2025-05-06 23:30:25
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: DB SUMMARY
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: DB Session ID:  3E2ZTRY0NM64UCA3TJW4
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: CURRENT file:  CURRENT
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: IDENTITY file:  IDENTITY
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-0/store.db dir, Total Num: 0, files: 
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-0/store.db: 000004.log size: 807 ; 
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                         Options.error_if_exists: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                       Options.create_if_missing: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                         Options.paranoid_checks: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                                     Options.env: 0x557647b36c40
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                                      Options.fs: PosixFileSystem
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                                Options.info_log: 0x557649962e80
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                Options.max_file_opening_threads: 16
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                              Options.statistics: (nil)
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                               Options.use_fsync: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                       Options.max_log_file_size: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                         Options.allow_fallocate: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                        Options.use_direct_reads: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:          Options.create_missing_column_families: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                              Options.db_log_dir: 
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                                 Options.wal_dir: 
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                   Options.advise_random_on_open: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                    Options.write_buffer_manager: 0x557649972b40
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                            Options.rate_limiter: (nil)
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                  Options.unordered_write: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                               Options.row_cache: None
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                              Options.wal_filter: None
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.allow_ingest_behind: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.two_write_queues: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.manual_wal_flush: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.wal_compression: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.atomic_flush: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                 Options.log_readahead_size: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.allow_data_in_errors: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.db_host_id: __hostname__
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.max_background_jobs: 2
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.max_background_compactions: -1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.max_subcompactions: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.max_total_wal_size: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                          Options.max_open_files: -1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                          Options.bytes_per_sync: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:       Options.compaction_readahead_size: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                  Options.max_background_flushes: -1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Compression algorithms supported:
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: #011kZSTD supported: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: #011kXpressCompression supported: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: #011kBZip2Compression supported: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: #011kLZ4Compression supported: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: #011kZlibCompression supported: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: #011kLZ4HCCompression supported: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: #011kSnappyCompression supported: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-0/store.db/MANIFEST-000005
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:           Options.merge_operator: 
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:        Options.compaction_filter: None
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557649962a80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55764995b1f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:        Options.write_buffer_size: 33554432
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:  Options.max_write_buffer_number: 2
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:          Options.compression: NoCompression
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.num_levels: 7
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-0/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 45d3ecca-3e60-40df-8d21-b0b3630e7b99
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580340065528, "job": 1, "event": "recovery_started", "wal_files": [4]}
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580340067969, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1944, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 819, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 696, "raw_average_value_size": 139, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "3E2ZTRY0NM64UCA3TJW4", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580340068175, "job": 1, "event": "recovery_finished"}
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x557649984e00
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: DB pointer 0x557649a8e000
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.90 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      1/0    1.90 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.16 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.16 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55764995b1f0#2 capacity: 512.00 MB usage: 1.17 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 4.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(2,0.95 KB,0.000181794%)#012#012** File Read Latency Histogram By Level [default] **
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: starting mon.compute-0 rank 0 at public addrs [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] at bind addrs [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-0 fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@-1(???) e0 preinit fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@-1(probing) e0  my rank is now 0 (was -1)
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(probing) e0 win_standalone_election
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: paxos.0).electionLogic(0) init, first boot, initializing epoch at 1 
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(electing) e0 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: log_channel(cluster) log [INF] : mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 create_pending setting backfillfull_ratio = 0.9
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 create_pending setting full_ratio = 0.95
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 create_pending setting nearfull_ratio = 0.85
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 do_prune osdmap full prune enabled
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 encode_pending skipping prime_pg_temp; mapping job did not start
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader) e0 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).paxosservice(auth 0..0) refresh upgraded, format 3 -> 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(probing) e1 win_standalone_election
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: paxos.0).electionLogic(2) init, last seen epoch 2
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(electing) e1 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: log_channel(cluster) log [INF] : mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: log_channel(cluster) log [DBG] : monmap e1: 1 mons at {compute-0=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0]} removed_ranks: {} disallowed_leaders: {}
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader) e1 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mgrc update_daemon_metadata mon.compute-0 metadata {addrs=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,ceph_version_when_created=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-0,container_image=quay.io/ceph/ceph:v18,cpu=AMD EPYC-Rome Processor,created_at=2025-12-01T09:12:18.391342Z,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-0,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025,kernel_version=5.14.0-642.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864320,os=Linux}
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 create_pending setting backfillfull_ratio = 0.9
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 create_pending setting full_ratio = 0.95
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 create_pending setting nearfull_ratio = 0.85
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 do_prune osdmap full prune enabled
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 encode_pending skipping prime_pg_temp; mapping job did not start
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader) e1 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).mds e1 new map
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).mds e1 print_map#012e1#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: -1#012 #012No filesystems configured
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).paxosservice(auth 0..0) refresh upgraded, format 3 -> 0
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: log_channel(cluster) log [DBG] : fsmap 
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).osd e1 e1: 0 total, 0 up, 0 in
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).osd e1 crush map has features 3314932999778484224, adjusting msgr requires
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mkfs 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader).paxosservice(auth 1..1) refresh upgraded, format 0 -> 3
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: log_channel(cluster) log [DBG] : osdmap e1: 0 total, 0 up, 0 in
Dec  1 04:12:20 np0005540741 podman[74676]: 2025-12-01 09:12:20.115976008 +0000 UTC m=+0.039582376 container create c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2 (image=quay.io/ceph/ceph:v18, name=nifty_matsumoto, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: log_channel(cluster) log [DBG] : mgrmap e1: no daemons active
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec  1 04:12:20 np0005540741 systemd[1]: Started libpod-conmon-c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2.scope.
Dec  1 04:12:20 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:20 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab55e3c47e9c0f75159d73f9de288637a04c584ef936f9586d99797e42236072/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:20 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab55e3c47e9c0f75159d73f9de288637a04c584ef936f9586d99797e42236072/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:20 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab55e3c47e9c0f75159d73f9de288637a04c584ef936f9586d99797e42236072/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:20 np0005540741 podman[74676]: 2025-12-01 09:12:20.098724988 +0000 UTC m=+0.022331376 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:20 np0005540741 podman[74676]: 2025-12-01 09:12:20.197140716 +0000 UTC m=+0.120747084 container init c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2 (image=quay.io/ceph/ceph:v18, name=nifty_matsumoto, CEPH_REF=reef, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:12:20 np0005540741 podman[74676]: 2025-12-01 09:12:20.205765631 +0000 UTC m=+0.129372000 container start c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2 (image=quay.io/ceph/ceph:v18, name=nifty_matsumoto, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 04:12:20 np0005540741 podman[74676]: 2025-12-01 09:12:20.208800348 +0000 UTC m=+0.132406716 container attach c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2 (image=quay.io/ceph/ceph:v18, name=nifty_matsumoto, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status"} v 0) v1
Dec  1 04:12:20 np0005540741 ceph-mon[74672]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1884213047' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec  1 04:12:20 np0005540741 nifty_matsumoto[74725]:  cluster:
Dec  1 04:12:20 np0005540741 nifty_matsumoto[74725]:    id:     5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec  1 04:12:20 np0005540741 nifty_matsumoto[74725]:    health: HEALTH_OK
Dec  1 04:12:20 np0005540741 nifty_matsumoto[74725]: 
Dec  1 04:12:20 np0005540741 nifty_matsumoto[74725]:  services:
Dec  1 04:12:20 np0005540741 nifty_matsumoto[74725]:    mon: 1 daemons, quorum compute-0 (age 0.516339s)
Dec  1 04:12:20 np0005540741 nifty_matsumoto[74725]:    mgr: no daemons active
Dec  1 04:12:20 np0005540741 nifty_matsumoto[74725]:    osd: 0 osds: 0 up, 0 in
Dec  1 04:12:20 np0005540741 nifty_matsumoto[74725]: 
Dec  1 04:12:20 np0005540741 nifty_matsumoto[74725]:  data:
Dec  1 04:12:20 np0005540741 nifty_matsumoto[74725]:    pools:   0 pools, 0 pgs
Dec  1 04:12:20 np0005540741 nifty_matsumoto[74725]:    objects: 0 objects, 0 B
Dec  1 04:12:20 np0005540741 nifty_matsumoto[74725]:    usage:   0 B used, 0 B / 0 B avail
Dec  1 04:12:20 np0005540741 nifty_matsumoto[74725]:    pgs:     
Dec  1 04:12:20 np0005540741 nifty_matsumoto[74725]: 
Dec  1 04:12:20 np0005540741 systemd[1]: libpod-c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2.scope: Deactivated successfully.
Dec  1 04:12:20 np0005540741 podman[74676]: 2025-12-01 09:12:20.632728271 +0000 UTC m=+0.556334629 container died c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2 (image=quay.io/ceph/ceph:v18, name=nifty_matsumoto, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:12:20 np0005540741 systemd[1]: var-lib-containers-storage-overlay-ab55e3c47e9c0f75159d73f9de288637a04c584ef936f9586d99797e42236072-merged.mount: Deactivated successfully.
Dec  1 04:12:20 np0005540741 podman[74676]: 2025-12-01 09:12:20.681709774 +0000 UTC m=+0.605316132 container remove c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2 (image=quay.io/ceph/ceph:v18, name=nifty_matsumoto, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:12:20 np0005540741 systemd[1]: libpod-conmon-c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2.scope: Deactivated successfully.
Dec  1 04:12:20 np0005540741 podman[74768]: 2025-12-01 09:12:20.743792709 +0000 UTC m=+0.042966052 container create c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2 (image=quay.io/ceph/ceph:v18, name=competent_albattani, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:12:20 np0005540741 systemd[1]: Started libpod-conmon-c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2.scope.
Dec  1 04:12:20 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:20 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a88ca7a93efb54ff26eec6476c77042d20cfef9fdedbd701a4f424ef3e3da96/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:20 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a88ca7a93efb54ff26eec6476c77042d20cfef9fdedbd701a4f424ef3e3da96/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:20 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a88ca7a93efb54ff26eec6476c77042d20cfef9fdedbd701a4f424ef3e3da96/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:20 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a88ca7a93efb54ff26eec6476c77042d20cfef9fdedbd701a4f424ef3e3da96/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:20 np0005540741 podman[74768]: 2025-12-01 09:12:20.815040125 +0000 UTC m=+0.114213478 container init c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2 (image=quay.io/ceph/ceph:v18, name=competent_albattani, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:12:20 np0005540741 podman[74768]: 2025-12-01 09:12:20.723648677 +0000 UTC m=+0.022822050 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:20 np0005540741 podman[74768]: 2025-12-01 09:12:20.819957995 +0000 UTC m=+0.119131338 container start c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2 (image=quay.io/ceph/ceph:v18, name=competent_albattani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Dec  1 04:12:20 np0005540741 podman[74768]: 2025-12-01 09:12:20.823359112 +0000 UTC m=+0.122532455 container attach c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2 (image=quay.io/ceph/ceph:v18, name=competent_albattani, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:12:21 np0005540741 ceph-mon[74672]: mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec  1 04:12:21 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config assimilate-conf"} v 0) v1
Dec  1 04:12:21 np0005540741 ceph-mon[74672]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/4273641964' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Dec  1 04:12:21 np0005540741 ceph-mon[74672]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/4273641964' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Dec  1 04:12:21 np0005540741 competent_albattani[74786]: 
Dec  1 04:12:21 np0005540741 competent_albattani[74786]: [global]
Dec  1 04:12:21 np0005540741 competent_albattani[74786]: #011fsid = 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec  1 04:12:21 np0005540741 competent_albattani[74786]: #011mon_host = [v2:192.168.122.100:3300,v1:192.168.122.100:6789]
Dec  1 04:12:21 np0005540741 competent_albattani[74786]: #011osd_crush_chooseleaf_type = 0
Dec  1 04:12:21 np0005540741 systemd[1]: libpod-c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2.scope: Deactivated successfully.
Dec  1 04:12:21 np0005540741 podman[74768]: 2025-12-01 09:12:21.23336844 +0000 UTC m=+0.532541793 container died c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2 (image=quay.io/ceph/ceph:v18, name=competent_albattani, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec  1 04:12:21 np0005540741 systemd[1]: var-lib-containers-storage-overlay-1a88ca7a93efb54ff26eec6476c77042d20cfef9fdedbd701a4f424ef3e3da96-merged.mount: Deactivated successfully.
Dec  1 04:12:21 np0005540741 podman[74768]: 2025-12-01 09:12:21.276972109 +0000 UTC m=+0.576145442 container remove c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2 (image=quay.io/ceph/ceph:v18, name=competent_albattani, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True)
Dec  1 04:12:21 np0005540741 systemd[1]: libpod-conmon-c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2.scope: Deactivated successfully.
Dec  1 04:12:21 np0005540741 podman[74823]: 2025-12-01 09:12:21.336109291 +0000 UTC m=+0.038999280 container create ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa (image=quay.io/ceph/ceph:v18, name=quirky_khorana, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:12:21 np0005540741 systemd[1]: Started libpod-conmon-ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa.scope.
Dec  1 04:12:21 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:21 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab92c57de76ca6b6240b70a4f7d798d245ca818f7fb1dae48860aede3acf2bd4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:21 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab92c57de76ca6b6240b70a4f7d798d245ca818f7fb1dae48860aede3acf2bd4/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:21 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab92c57de76ca6b6240b70a4f7d798d245ca818f7fb1dae48860aede3acf2bd4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:21 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab92c57de76ca6b6240b70a4f7d798d245ca818f7fb1dae48860aede3acf2bd4/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:21 np0005540741 podman[74823]: 2025-12-01 09:12:21.411772032 +0000 UTC m=+0.114662071 container init ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa (image=quay.io/ceph/ceph:v18, name=quirky_khorana, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec  1 04:12:21 np0005540741 podman[74823]: 2025-12-01 09:12:21.32025736 +0000 UTC m=+0.023147369 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:21 np0005540741 podman[74823]: 2025-12-01 09:12:21.421436197 +0000 UTC m=+0.124326186 container start ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa (image=quay.io/ceph/ceph:v18, name=quirky_khorana, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec  1 04:12:21 np0005540741 podman[74823]: 2025-12-01 09:12:21.424780002 +0000 UTC m=+0.127669991 container attach ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa (image=quay.io/ceph/ceph:v18, name=quirky_khorana, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Dec  1 04:12:21 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:12:21 np0005540741 ceph-mon[74672]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3612672887' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:12:21 np0005540741 systemd[1]: libpod-ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa.scope: Deactivated successfully.
Dec  1 04:12:21 np0005540741 podman[74823]: 2025-12-01 09:12:21.823458608 +0000 UTC m=+0.526348597 container died ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa (image=quay.io/ceph/ceph:v18, name=quirky_khorana, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:12:21 np0005540741 systemd[1]: var-lib-containers-storage-overlay-ab92c57de76ca6b6240b70a4f7d798d245ca818f7fb1dae48860aede3acf2bd4-merged.mount: Deactivated successfully.
Dec  1 04:12:21 np0005540741 podman[74823]: 2025-12-01 09:12:21.868666113 +0000 UTC m=+0.571556102 container remove ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa (image=quay.io/ceph/ceph:v18, name=quirky_khorana, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  1 04:12:21 np0005540741 systemd[1]: libpod-conmon-ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa.scope: Deactivated successfully.
Dec  1 04:12:21 np0005540741 systemd[1]: Stopping Ceph mon.compute-0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec  1 04:12:22 np0005540741 ceph-mon[74672]: received  signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.compute-0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false  (PID: 1) UID: 0
Dec  1 04:12:22 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader) e1 *** Got Signal Terminated ***
Dec  1 04:12:22 np0005540741 ceph-mon[74672]: mon.compute-0@0(leader) e1 shutdown
Dec  1 04:12:22 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0[74668]: 2025-12-01T09:12:22.050+0000 7f739e8c7640 -1 received  signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.compute-0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false  (PID: 1) UID: 0
Dec  1 04:12:22 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0[74668]: 2025-12-01T09:12:22.050+0000 7f739e8c7640 -1 mon.compute-0@0(leader) e1 *** Got Signal Terminated ***
Dec  1 04:12:22 np0005540741 ceph-mon[74672]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec  1 04:12:22 np0005540741 ceph-mon[74672]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec  1 04:12:22 np0005540741 podman[74907]: 2025-12-01 09:12:22.306001828 +0000 UTC m=+0.287144645 container died cefa5d72af91d86283f64418a14a9aa1bf5270d72d08d9ff1301e3ba772fd855 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3)
Dec  1 04:12:22 np0005540741 systemd[1]: var-lib-containers-storage-overlay-6c3e450d579be7900aad28102be8a81fcc3c93003a0b07817df9ad5b3f76dae6-merged.mount: Deactivated successfully.
Dec  1 04:12:22 np0005540741 podman[74907]: 2025-12-01 09:12:22.34264093 +0000 UTC m=+0.323783757 container remove cefa5d72af91d86283f64418a14a9aa1bf5270d72d08d9ff1301e3ba772fd855 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:12:22 np0005540741 bash[74907]: ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0
Dec  1 04:12:22 np0005540741 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:12:22 np0005540741 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:12:22 np0005540741 systemd[1]: ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@mon.compute-0.service: Deactivated successfully.
Dec  1 04:12:22 np0005540741 systemd[1]: Stopped Ceph mon.compute-0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec  1 04:12:22 np0005540741 systemd[1]: Starting Ceph mon.compute-0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec  1 04:12:22 np0005540741 podman[75011]: 2025-12-01 09:12:22.663057701 +0000 UTC m=+0.034416970 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:22 np0005540741 podman[75011]: 2025-12-01 09:12:22.776721221 +0000 UTC m=+0.148080490 container create a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec  1 04:12:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25e1f43ce5c8d6f2ee734a07843f5f828178b98c2cd26e90afe9298f37e21e66/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25e1f43ce5c8d6f2ee734a07843f5f828178b98c2cd26e90afe9298f37e21e66/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25e1f43ce5c8d6f2ee734a07843f5f828178b98c2cd26e90afe9298f37e21e66/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25e1f43ce5c8d6f2ee734a07843f5f828178b98c2cd26e90afe9298f37e21e66/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:22 np0005540741 podman[75011]: 2025-12-01 09:12:22.835632557 +0000 UTC m=+0.206991836 container init a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec  1 04:12:22 np0005540741 podman[75011]: 2025-12-01 09:12:22.840773553 +0000 UTC m=+0.212132812 container start a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:12:22 np0005540741 bash[75011]: a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7
Dec  1 04:12:22 np0005540741 systemd[1]: Started Ceph mon.compute-0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mon, pid 2
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: pidfile_write: ignore empty --pid-file
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: load: jerasure load: lrc 
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: RocksDB version: 7.9.2
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Git sha 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Compile date 2025-05-06 23:30:25
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: DB SUMMARY
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: DB Session ID:  2DUIFG3VBWNEITLEK8RC
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: CURRENT file:  CURRENT
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: IDENTITY file:  IDENTITY
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: MANIFEST file:  MANIFEST-000010 size: 179 Bytes
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-0/store.db dir, Total Num: 1, files: 000008.sst 
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-0/store.db: 000009.log size: 52078 ; 
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                         Options.error_if_exists: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                       Options.create_if_missing: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                         Options.paranoid_checks: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                                     Options.env: 0x55bbd48ffc40
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                                      Options.fs: PosixFileSystem
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                                Options.info_log: 0x55bbd56bd040
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                Options.max_file_opening_threads: 16
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                              Options.statistics: (nil)
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                               Options.use_fsync: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                       Options.max_log_file_size: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                         Options.allow_fallocate: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                        Options.use_direct_reads: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:          Options.create_missing_column_families: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                              Options.db_log_dir: 
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                                 Options.wal_dir: 
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                   Options.advise_random_on_open: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                    Options.write_buffer_manager: 0x55bbd56ccb40
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                            Options.rate_limiter: (nil)
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                  Options.unordered_write: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                               Options.row_cache: None
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                              Options.wal_filter: None
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.allow_ingest_behind: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.two_write_queues: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.manual_wal_flush: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.wal_compression: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.atomic_flush: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                 Options.log_readahead_size: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.allow_data_in_errors: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.db_host_id: __hostname__
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.max_background_jobs: 2
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.max_background_compactions: -1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.max_subcompactions: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.max_total_wal_size: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                          Options.max_open_files: -1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                          Options.bytes_per_sync: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:       Options.compaction_readahead_size: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                  Options.max_background_flushes: -1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Compression algorithms supported:
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: #011kZSTD supported: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: #011kXpressCompression supported: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: #011kBZip2Compression supported: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: #011kLZ4Compression supported: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: #011kZlibCompression supported: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: #011kLZ4HCCompression supported: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: #011kSnappyCompression supported: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-0/store.db/MANIFEST-000010
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:           Options.merge_operator: 
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:        Options.compaction_filter: None
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bbd56bcc40)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bbd56b51f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:        Options.write_buffer_size: 33554432
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:  Options.max_write_buffer_number: 2
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:          Options.compression: NoCompression
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.num_levels: 7
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-0/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 12, last_sequence is 5, log_number is 5,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 5
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 45d3ecca-3e60-40df-8d21-b0b3630e7b99
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580342880545, "job": 1, "event": "recovery_started", "wal_files": [9]}
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #9 mode 2
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580342883395, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 13, "file_size": 51794, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8, "largest_seqno": 129, "table_properties": {"data_size": 50351, "index_size": 149, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 261, "raw_key_size": 2940, "raw_average_key_size": 30, "raw_value_size": 48030, "raw_average_value_size": 500, "num_data_blocks": 7, "num_entries": 96, "num_filter_entries": 96, "num_deletions": 3, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580342, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 13, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580342883504, "job": 1, "event": "recovery_finished"}
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:5047] Creating manifest 15
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55bbd56dee00
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: DB pointer 0x55bbd5768000
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0   52.48 KB   0.5      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     19.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012 Sum      2/0   52.48 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     19.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     19.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     19.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 4.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 4.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bbd56b51f0#2 capacity: 512.00 MB usage: 0.77 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(2,0.42 KB,8.04663e-05%) IndexBlock(2,0.34 KB,6.55651e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: starting mon.compute-0 rank 0 at public addrs [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] at bind addrs [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-0 fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: mon.compute-0@-1(???) e1 preinit fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: mon.compute-0@-1(???).mds e1 new map
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: mon.compute-0@-1(???).mds e1 print_map#012e1#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: -1#012 #012No filesystems configured
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: mon.compute-0@-1(???).osd e1 crush map has features 3314932999778484224, adjusting msgr requires
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: mon.compute-0@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: mon.compute-0@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: mon.compute-0@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: mon.compute-0@-1(???).paxosservice(auth 1..2) refresh upgraded, format 0 -> 3
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: mon.compute-0@-1(probing) e1  my rank is now 0 (was -1)
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(probing) e1 win_standalone_election
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: paxos.0).electionLogic(3) init, last seen epoch 3, mid-election, bumping
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(electing) e1 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : monmap e1: 1 mons at {compute-0=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0]} removed_ranks: {} disallowed_leaders: {}
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : fsmap 
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e1: 0 total, 0 up, 0 in
Dec  1 04:12:22 np0005540741 podman[75032]: 2025-12-01 09:12:22.905702989 +0000 UTC m=+0.039463353 container create 8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744 (image=quay.io/ceph/ceph:v18, name=musing_haslett, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e1: no daemons active
Dec  1 04:12:22 np0005540741 systemd[1]: Started libpod-conmon-8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744.scope.
Dec  1 04:12:22 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f150891d01ab69b45cabda0c8dc4f9353fd69c804f4ea9e1dbe350002188fb7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f150891d01ab69b45cabda0c8dc4f9353fd69c804f4ea9e1dbe350002188fb7/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f150891d01ab69b45cabda0c8dc4f9353fd69c804f4ea9e1dbe350002188fb7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:22 np0005540741 ceph-mon[75031]: mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec  1 04:12:22 np0005540741 podman[75032]: 2025-12-01 09:12:22.964939073 +0000 UTC m=+0.098699437 container init 8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744 (image=quay.io/ceph/ceph:v18, name=musing_haslett, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef)
Dec  1 04:12:22 np0005540741 podman[75032]: 2025-12-01 09:12:22.970039608 +0000 UTC m=+0.103799972 container start 8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744 (image=quay.io/ceph/ceph:v18, name=musing_haslett, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:12:22 np0005540741 podman[75032]: 2025-12-01 09:12:22.973117216 +0000 UTC m=+0.106877610 container attach 8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744 (image=quay.io/ceph/ceph:v18, name=musing_haslett, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec  1 04:12:22 np0005540741 podman[75032]: 2025-12-01 09:12:22.889254611 +0000 UTC m=+0.023014975 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=public_network}] v 0) v1
Dec  1 04:12:23 np0005540741 systemd[1]: libpod-8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744.scope: Deactivated successfully.
Dec  1 04:12:23 np0005540741 conmon[75086]: conmon 8cd0c44f0f4f3aef750b <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744.scope/container/memory.events
Dec  1 04:12:23 np0005540741 podman[75032]: 2025-12-01 09:12:23.390853613 +0000 UTC m=+0.524613987 container died 8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744 (image=quay.io/ceph/ceph:v18, name=musing_haslett, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec  1 04:12:23 np0005540741 systemd[1]: var-lib-containers-storage-overlay-4f150891d01ab69b45cabda0c8dc4f9353fd69c804f4ea9e1dbe350002188fb7-merged.mount: Deactivated successfully.
Dec  1 04:12:23 np0005540741 podman[75032]: 2025-12-01 09:12:23.478761973 +0000 UTC m=+0.612522337 container remove 8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744 (image=quay.io/ceph/ceph:v18, name=musing_haslett, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Dec  1 04:12:23 np0005540741 systemd[1]: libpod-conmon-8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744.scope: Deactivated successfully.
Dec  1 04:12:23 np0005540741 podman[75124]: 2025-12-01 09:12:23.531702198 +0000 UTC m=+0.036581961 container create a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8 (image=quay.io/ceph/ceph:v18, name=beautiful_pare, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec  1 04:12:23 np0005540741 systemd[1]: Started libpod-conmon-a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8.scope.
Dec  1 04:12:23 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:23 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca66f84caa9593e9a3ba3d861c10f5d0b5cec7885cd1206298467e83650d6127/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:23 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca66f84caa9593e9a3ba3d861c10f5d0b5cec7885cd1206298467e83650d6127/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:23 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca66f84caa9593e9a3ba3d861c10f5d0b5cec7885cd1206298467e83650d6127/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:23 np0005540741 podman[75124]: 2025-12-01 09:12:23.594898815 +0000 UTC m=+0.099778638 container init a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8 (image=quay.io/ceph/ceph:v18, name=beautiful_pare, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec  1 04:12:23 np0005540741 podman[75124]: 2025-12-01 09:12:23.601010409 +0000 UTC m=+0.105890162 container start a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8 (image=quay.io/ceph/ceph:v18, name=beautiful_pare, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:12:23 np0005540741 podman[75124]: 2025-12-01 09:12:23.604125277 +0000 UTC m=+0.109005050 container attach a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8 (image=quay.io/ceph/ceph:v18, name=beautiful_pare, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:12:23 np0005540741 podman[75124]: 2025-12-01 09:12:23.51630547 +0000 UTC m=+0.021185243 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=cluster_network}] v 0) v1
Dec  1 04:12:24 np0005540741 systemd[1]: libpod-a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8.scope: Deactivated successfully.
Dec  1 04:12:24 np0005540741 podman[75124]: 2025-12-01 09:12:24.005195991 +0000 UTC m=+0.510075744 container died a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8 (image=quay.io/ceph/ceph:v18, name=beautiful_pare, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:12:24 np0005540741 systemd[1]: var-lib-containers-storage-overlay-ca66f84caa9593e9a3ba3d861c10f5d0b5cec7885cd1206298467e83650d6127-merged.mount: Deactivated successfully.
Dec  1 04:12:24 np0005540741 podman[75124]: 2025-12-01 09:12:24.212196907 +0000 UTC m=+0.717076660 container remove a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8 (image=quay.io/ceph/ceph:v18, name=beautiful_pare, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec  1 04:12:24 np0005540741 systemd[1]: libpod-conmon-a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8.scope: Deactivated successfully.
Dec  1 04:12:24 np0005540741 systemd[1]: Reloading.
Dec  1 04:12:24 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:12:24 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:12:24 np0005540741 systemd[1]: Reloading.
Dec  1 04:12:24 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:12:24 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:12:24 np0005540741 systemd[1]: Starting Ceph mgr.compute-0.psduho for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec  1 04:12:24 np0005540741 podman[75304]: 2025-12-01 09:12:24.987027417 +0000 UTC m=+0.042793758 container create d04e39f9595930c329593b0175fd79c3eaca1ee860ab21f8e9c224dfe8abed9b (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Dec  1 04:12:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77ef07e55d6a56d24c49c75a663dce65c3f6b4cb048e39f91ec5bc9931a540e4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77ef07e55d6a56d24c49c75a663dce65c3f6b4cb048e39f91ec5bc9931a540e4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77ef07e55d6a56d24c49c75a663dce65c3f6b4cb048e39f91ec5bc9931a540e4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77ef07e55d6a56d24c49c75a663dce65c3f6b4cb048e39f91ec5bc9931a540e4/merged/var/lib/ceph/mgr/ceph-compute-0.psduho supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:25 np0005540741 podman[75304]: 2025-12-01 09:12:25.038264794 +0000 UTC m=+0.094031155 container init d04e39f9595930c329593b0175fd79c3eaca1ee860ab21f8e9c224dfe8abed9b (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:12:25 np0005540741 podman[75304]: 2025-12-01 09:12:25.042774642 +0000 UTC m=+0.098540983 container start d04e39f9595930c329593b0175fd79c3eaca1ee860ab21f8e9c224dfe8abed9b (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec  1 04:12:25 np0005540741 bash[75304]: d04e39f9595930c329593b0175fd79c3eaca1ee860ab21f8e9c224dfe8abed9b
Dec  1 04:12:25 np0005540741 podman[75304]: 2025-12-01 09:12:24.968465559 +0000 UTC m=+0.024231920 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:25 np0005540741 systemd[1]: Started Ceph mgr.compute-0.psduho for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec  1 04:12:25 np0005540741 ceph-mgr[75324]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:12:25 np0005540741 ceph-mgr[75324]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Dec  1 04:12:25 np0005540741 ceph-mgr[75324]: pidfile_write: ignore empty --pid-file
Dec  1 04:12:25 np0005540741 podman[75325]: 2025-12-01 09:12:25.122218411 +0000 UTC m=+0.042566432 container create 123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559 (image=quay.io/ceph/ceph:v18, name=festive_sutherland, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:12:25 np0005540741 systemd[1]: Started libpod-conmon-123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559.scope.
Dec  1 04:12:25 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'alerts'
Dec  1 04:12:25 np0005540741 podman[75325]: 2025-12-01 09:12:25.103726165 +0000 UTC m=+0.024074196 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:25 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/338005094f1dbd35386ef3f1269c25328b228dbec05459ea31aa5f86da4f58c0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/338005094f1dbd35386ef3f1269c25328b228dbec05459ea31aa5f86da4f58c0/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/338005094f1dbd35386ef3f1269c25328b228dbec05459ea31aa5f86da4f58c0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:25 np0005540741 podman[75325]: 2025-12-01 09:12:25.223941243 +0000 UTC m=+0.144289254 container init 123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559 (image=quay.io/ceph/ceph:v18, name=festive_sutherland, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default)
Dec  1 04:12:25 np0005540741 podman[75325]: 2025-12-01 09:12:25.230617753 +0000 UTC m=+0.150965764 container start 123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559 (image=quay.io/ceph/ceph:v18, name=festive_sutherland, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:12:25 np0005540741 podman[75325]: 2025-12-01 09:12:25.248713597 +0000 UTC m=+0.169061618 container attach 123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559 (image=quay.io/ceph/ceph:v18, name=festive_sutherland, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec  1 04:12:25 np0005540741 ceph-mgr[75324]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:12:25 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'balancer'
Dec  1 04:12:25 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:25.528+0000 7f54427be140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:12:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec  1 04:12:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1543184186' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]: 
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]: {
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    "health": {
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "status": "HEALTH_OK",
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "checks": {},
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "mutes": []
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    },
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    "election_epoch": 5,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    "quorum": [
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        0
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    ],
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    "quorum_names": [
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "compute-0"
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    ],
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    "quorum_age": 2,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    "monmap": {
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "epoch": 1,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "min_mon_release_name": "reef",
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "num_mons": 1
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    },
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    "osdmap": {
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "epoch": 1,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "num_osds": 0,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "num_up_osds": 0,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "osd_up_since": 0,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "num_in_osds": 0,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "osd_in_since": 0,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "num_remapped_pgs": 0
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    },
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    "pgmap": {
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "pgs_by_state": [],
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "num_pgs": 0,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "num_pools": 0,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "num_objects": 0,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "data_bytes": 0,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "bytes_used": 0,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "bytes_avail": 0,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "bytes_total": 0
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    },
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    "fsmap": {
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "epoch": 1,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "by_rank": [],
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "up:standby": 0
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    },
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    "mgrmap": {
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "available": false,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "num_standbys": 0,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "modules": [
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:            "iostat",
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:            "nfs",
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:            "restful"
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        ],
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "services": {}
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    },
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    "servicemap": {
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "epoch": 1,
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "modified": "2025-12-01T09:12:20.101670+0000",
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:        "services": {}
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    },
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]:    "progress_events": {}
Dec  1 04:12:25 np0005540741 festive_sutherland[75366]: }
Dec  1 04:12:25 np0005540741 systemd[1]: libpod-123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559.scope: Deactivated successfully.
Dec  1 04:12:25 np0005540741 podman[75325]: 2025-12-01 09:12:25.654447564 +0000 UTC m=+0.574795565 container died 123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559 (image=quay.io/ceph/ceph:v18, name=festive_sutherland, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:12:25 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:25.793+0000 7f54427be140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:12:25 np0005540741 ceph-mgr[75324]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:12:25 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'cephadm'
Dec  1 04:12:26 np0005540741 systemd[1]: var-lib-containers-storage-overlay-338005094f1dbd35386ef3f1269c25328b228dbec05459ea31aa5f86da4f58c0-merged.mount: Deactivated successfully.
Dec  1 04:12:26 np0005540741 podman[75325]: 2025-12-01 09:12:26.065142121 +0000 UTC m=+0.985490132 container remove 123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559 (image=quay.io/ceph/ceph:v18, name=festive_sutherland, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2)
Dec  1 04:12:26 np0005540741 systemd[1]: libpod-conmon-123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559.scope: Deactivated successfully.
Dec  1 04:12:27 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'crash'
Dec  1 04:12:28 np0005540741 podman[75416]: 2025-12-01 09:12:28.124541186 +0000 UTC m=+0.037374214 container create 7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192 (image=quay.io/ceph/ceph:v18, name=peaceful_tharp, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:12:28 np0005540741 systemd[1]: Started libpod-conmon-7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192.scope.
Dec  1 04:12:28 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:28 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26e5b27a33154eac81b93ae6cc3dfac7311b4ee17aa1d5afe6229450499899f4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:28 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26e5b27a33154eac81b93ae6cc3dfac7311b4ee17aa1d5afe6229450499899f4/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:28 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26e5b27a33154eac81b93ae6cc3dfac7311b4ee17aa1d5afe6229450499899f4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:28 np0005540741 ceph-mgr[75324]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:12:28 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:28.175+0000 7f54427be140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:12:28 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'dashboard'
Dec  1 04:12:28 np0005540741 podman[75416]: 2025-12-01 09:12:28.190651296 +0000 UTC m=+0.103484324 container init 7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192 (image=quay.io/ceph/ceph:v18, name=peaceful_tharp, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Dec  1 04:12:28 np0005540741 podman[75416]: 2025-12-01 09:12:28.198492379 +0000 UTC m=+0.111325407 container start 7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192 (image=quay.io/ceph/ceph:v18, name=peaceful_tharp, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507)
Dec  1 04:12:28 np0005540741 podman[75416]: 2025-12-01 09:12:28.202348678 +0000 UTC m=+0.115181726 container attach 7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192 (image=quay.io/ceph/ceph:v18, name=peaceful_tharp, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:12:28 np0005540741 podman[75416]: 2025-12-01 09:12:28.107060849 +0000 UTC m=+0.019893897 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec  1 04:12:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4027611761' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]: 
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]: {
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    "health": {
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "status": "HEALTH_OK",
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "checks": {},
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "mutes": []
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    },
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    "election_epoch": 5,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    "quorum": [
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        0
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    ],
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    "quorum_names": [
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "compute-0"
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    ],
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    "quorum_age": 5,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    "monmap": {
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "epoch": 1,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "min_mon_release_name": "reef",
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "num_mons": 1
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    },
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    "osdmap": {
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "epoch": 1,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "num_osds": 0,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "num_up_osds": 0,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "osd_up_since": 0,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "num_in_osds": 0,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "osd_in_since": 0,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "num_remapped_pgs": 0
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    },
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    "pgmap": {
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "pgs_by_state": [],
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "num_pgs": 0,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "num_pools": 0,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "num_objects": 0,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "data_bytes": 0,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "bytes_used": 0,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "bytes_avail": 0,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "bytes_total": 0
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    },
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    "fsmap": {
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "epoch": 1,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "by_rank": [],
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "up:standby": 0
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    },
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    "mgrmap": {
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "available": false,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "num_standbys": 0,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "modules": [
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:            "iostat",
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:            "nfs",
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:            "restful"
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        ],
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "services": {}
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    },
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    "servicemap": {
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "epoch": 1,
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "modified": "2025-12-01T09:12:20.101670+0000",
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:        "services": {}
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    },
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]:    "progress_events": {}
Dec  1 04:12:28 np0005540741 peaceful_tharp[75432]: }
Dec  1 04:12:28 np0005540741 systemd[1]: libpod-7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192.scope: Deactivated successfully.
Dec  1 04:12:28 np0005540741 podman[75459]: 2025-12-01 09:12:28.635769802 +0000 UTC m=+0.023879700 container died 7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192 (image=quay.io/ceph/ceph:v18, name=peaceful_tharp, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec  1 04:12:28 np0005540741 systemd[1]: var-lib-containers-storage-overlay-26e5b27a33154eac81b93ae6cc3dfac7311b4ee17aa1d5afe6229450499899f4-merged.mount: Deactivated successfully.
Dec  1 04:12:28 np0005540741 podman[75459]: 2025-12-01 09:12:28.732018339 +0000 UTC m=+0.120128217 container remove 7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192 (image=quay.io/ceph/ceph:v18, name=peaceful_tharp, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True)
Dec  1 04:12:28 np0005540741 systemd[1]: libpod-conmon-7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192.scope: Deactivated successfully.
Dec  1 04:12:29 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'devicehealth'
Dec  1 04:12:29 np0005540741 ceph-mgr[75324]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:12:29 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:29.987+0000 7f54427be140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:12:29 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'diskprediction_local'
Dec  1 04:12:30 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  1 04:12:30 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  1 04:12:30 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]:  from numpy import show_config as show_numpy_config
Dec  1 04:12:30 np0005540741 ceph-mgr[75324]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:12:30 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:30.603+0000 7f54427be140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:12:30 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'influx'
Dec  1 04:12:30 np0005540741 podman[75473]: 2025-12-01 09:12:30.849450813 +0000 UTC m=+0.090484374 container create 3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66 (image=quay.io/ceph/ceph:v18, name=gallant_wing, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec  1 04:12:30 np0005540741 podman[75473]: 2025-12-01 09:12:30.783749175 +0000 UTC m=+0.024782776 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:30 np0005540741 ceph-mgr[75324]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:12:30 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:30.884+0000 7f54427be140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:12:30 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'insights'
Dec  1 04:12:30 np0005540741 systemd[1]: Started libpod-conmon-3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66.scope.
Dec  1 04:12:30 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6648b0356c9786bed952e3c7ab67904cf4c27f0d2b7c4d6f02f82294e3f23f2e/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6648b0356c9786bed952e3c7ab67904cf4c27f0d2b7c4d6f02f82294e3f23f2e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6648b0356c9786bed952e3c7ab67904cf4c27f0d2b7c4d6f02f82294e3f23f2e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:30 np0005540741 podman[75473]: 2025-12-01 09:12:30.936732864 +0000 UTC m=+0.177766425 container init 3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66 (image=quay.io/ceph/ceph:v18, name=gallant_wing, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec  1 04:12:30 np0005540741 podman[75473]: 2025-12-01 09:12:30.941284604 +0000 UTC m=+0.182318145 container start 3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66 (image=quay.io/ceph/ceph:v18, name=gallant_wing, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:12:30 np0005540741 podman[75473]: 2025-12-01 09:12:30.944782573 +0000 UTC m=+0.185816124 container attach 3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66 (image=quay.io/ceph/ceph:v18, name=gallant_wing, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:12:31 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'iostat'
Dec  1 04:12:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec  1 04:12:31 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3742582137' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec  1 04:12:31 np0005540741 gallant_wing[75489]: 
Dec  1 04:12:31 np0005540741 gallant_wing[75489]: {
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    "health": {
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "status": "HEALTH_OK",
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "checks": {},
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "mutes": []
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    },
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    "election_epoch": 5,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    "quorum": [
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        0
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    ],
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    "quorum_names": [
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "compute-0"
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    ],
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    "quorum_age": 8,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    "monmap": {
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "epoch": 1,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "min_mon_release_name": "reef",
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "num_mons": 1
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    },
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    "osdmap": {
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "epoch": 1,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "num_osds": 0,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "num_up_osds": 0,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "osd_up_since": 0,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "num_in_osds": 0,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "osd_in_since": 0,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "num_remapped_pgs": 0
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    },
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    "pgmap": {
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "pgs_by_state": [],
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "num_pgs": 0,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "num_pools": 0,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "num_objects": 0,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "data_bytes": 0,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "bytes_used": 0,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "bytes_avail": 0,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "bytes_total": 0
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    },
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    "fsmap": {
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "epoch": 1,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "by_rank": [],
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "up:standby": 0
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    },
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    "mgrmap": {
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "available": false,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "num_standbys": 0,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "modules": [
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:            "iostat",
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:            "nfs",
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:            "restful"
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        ],
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "services": {}
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    },
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    "servicemap": {
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "epoch": 1,
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "modified": "2025-12-01T09:12:20.101670+0000",
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:        "services": {}
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    },
Dec  1 04:12:31 np0005540741 gallant_wing[75489]:    "progress_events": {}
Dec  1 04:12:31 np0005540741 gallant_wing[75489]: }
Dec  1 04:12:31 np0005540741 systemd[1]: libpod-3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66.scope: Deactivated successfully.
Dec  1 04:12:31 np0005540741 podman[75473]: 2025-12-01 09:12:31.361751709 +0000 UTC m=+0.602785270 container died 3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66 (image=quay.io/ceph/ceph:v18, name=gallant_wing, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Dec  1 04:12:31 np0005540741 systemd[1]: var-lib-containers-storage-overlay-6648b0356c9786bed952e3c7ab67904cf4c27f0d2b7c4d6f02f82294e3f23f2e-merged.mount: Deactivated successfully.
Dec  1 04:12:31 np0005540741 ceph-mgr[75324]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:12:31 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'k8sevents'
Dec  1 04:12:31 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:31.413+0000 7f54427be140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:12:31 np0005540741 podman[75473]: 2025-12-01 09:12:31.465852629 +0000 UTC m=+0.706886180 container remove 3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66 (image=quay.io/ceph/ceph:v18, name=gallant_wing, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:12:31 np0005540741 systemd[1]: libpod-conmon-3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66.scope: Deactivated successfully.
Dec  1 04:12:33 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'localpool'
Dec  1 04:12:33 np0005540741 podman[75527]: 2025-12-01 09:12:33.523058951 +0000 UTC m=+0.036034836 container create 4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d (image=quay.io/ceph/ceph:v18, name=vibrant_rosalind, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  1 04:12:33 np0005540741 systemd[1]: Started libpod-conmon-4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d.scope.
Dec  1 04:12:33 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:33 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edd2dbd1264b0a0daca534acc396b9653e0248eda93f1ff39cc7399d9207f0a8/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:33 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edd2dbd1264b0a0daca534acc396b9653e0248eda93f1ff39cc7399d9207f0a8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:33 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edd2dbd1264b0a0daca534acc396b9653e0248eda93f1ff39cc7399d9207f0a8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:33 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'mds_autoscaler'
Dec  1 04:12:33 np0005540741 podman[75527]: 2025-12-01 09:12:33.507831798 +0000 UTC m=+0.020807703 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:33 np0005540741 podman[75527]: 2025-12-01 09:12:33.61551737 +0000 UTC m=+0.128493275 container init 4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d (image=quay.io/ceph/ceph:v18, name=vibrant_rosalind, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Dec  1 04:12:33 np0005540741 podman[75527]: 2025-12-01 09:12:33.620250204 +0000 UTC m=+0.133226089 container start 4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d (image=quay.io/ceph/ceph:v18, name=vibrant_rosalind, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec  1 04:12:33 np0005540741 podman[75527]: 2025-12-01 09:12:33.623484156 +0000 UTC m=+0.136460041 container attach 4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d (image=quay.io/ceph/ceph:v18, name=vibrant_rosalind, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Dec  1 04:12:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec  1 04:12:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/960206289' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]: 
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]: {
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    "health": {
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "status": "HEALTH_OK",
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "checks": {},
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "mutes": []
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    },
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    "election_epoch": 5,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    "quorum": [
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        0
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    ],
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    "quorum_names": [
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "compute-0"
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    ],
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    "quorum_age": 11,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    "monmap": {
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "epoch": 1,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "min_mon_release_name": "reef",
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "num_mons": 1
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    },
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    "osdmap": {
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "epoch": 1,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "num_osds": 0,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "num_up_osds": 0,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "osd_up_since": 0,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "num_in_osds": 0,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "osd_in_since": 0,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "num_remapped_pgs": 0
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    },
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    "pgmap": {
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "pgs_by_state": [],
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "num_pgs": 0,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "num_pools": 0,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "num_objects": 0,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "data_bytes": 0,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "bytes_used": 0,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "bytes_avail": 0,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "bytes_total": 0
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    },
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    "fsmap": {
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "epoch": 1,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "by_rank": [],
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "up:standby": 0
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    },
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    "mgrmap": {
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "available": false,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "num_standbys": 0,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "modules": [
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:            "iostat",
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:            "nfs",
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:            "restful"
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        ],
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "services": {}
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    },
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    "servicemap": {
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "epoch": 1,
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "modified": "2025-12-01T09:12:20.101670+0000",
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:        "services": {}
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    },
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]:    "progress_events": {}
Dec  1 04:12:34 np0005540741 vibrant_rosalind[75544]: }
Dec  1 04:12:34 np0005540741 systemd[1]: libpod-4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d.scope: Deactivated successfully.
Dec  1 04:12:34 np0005540741 podman[75527]: 2025-12-01 09:12:34.044509598 +0000 UTC m=+0.557485483 container died 4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d (image=quay.io/ceph/ceph:v18, name=vibrant_rosalind, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:12:34 np0005540741 systemd[1]: var-lib-containers-storage-overlay-edd2dbd1264b0a0daca534acc396b9653e0248eda93f1ff39cc7399d9207f0a8-merged.mount: Deactivated successfully.
Dec  1 04:12:34 np0005540741 podman[75527]: 2025-12-01 09:12:34.086505462 +0000 UTC m=+0.599481347 container remove 4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d (image=quay.io/ceph/ceph:v18, name=vibrant_rosalind, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec  1 04:12:34 np0005540741 systemd[1]: libpod-conmon-4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d.scope: Deactivated successfully.
Dec  1 04:12:34 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'mirroring'
Dec  1 04:12:34 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'nfs'
Dec  1 04:12:35 np0005540741 ceph-mgr[75324]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:12:35 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:35.409+0000 7f54427be140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:12:35 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'orchestrator'
Dec  1 04:12:36 np0005540741 ceph-mgr[75324]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:12:36 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:36.132+0000 7f54427be140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:12:36 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'osd_perf_query'
Dec  1 04:12:36 np0005540741 podman[75583]: 2025-12-01 09:12:36.149680295 +0000 UTC m=+0.042124059 container create a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170 (image=quay.io/ceph/ceph:v18, name=charming_elbakyan, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default)
Dec  1 04:12:36 np0005540741 systemd[1]: Started libpod-conmon-a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170.scope.
Dec  1 04:12:36 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:36 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f336aba3ff6ab595c6e66e85c5dbe7758c9dce32dfb6f9e6990178971a118afe/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:36 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f336aba3ff6ab595c6e66e85c5dbe7758c9dce32dfb6f9e6990178971a118afe/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:36 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f336aba3ff6ab595c6e66e85c5dbe7758c9dce32dfb6f9e6990178971a118afe/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:36 np0005540741 podman[75583]: 2025-12-01 09:12:36.207219121 +0000 UTC m=+0.099662905 container init a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170 (image=quay.io/ceph/ceph:v18, name=charming_elbakyan, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:12:36 np0005540741 podman[75583]: 2025-12-01 09:12:36.213046346 +0000 UTC m=+0.105490110 container start a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170 (image=quay.io/ceph/ceph:v18, name=charming_elbakyan, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:12:36 np0005540741 podman[75583]: 2025-12-01 09:12:36.216020881 +0000 UTC m=+0.108464695 container attach a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170 (image=quay.io/ceph/ceph:v18, name=charming_elbakyan, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  1 04:12:36 np0005540741 podman[75583]: 2025-12-01 09:12:36.130050106 +0000 UTC m=+0.022493870 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:36 np0005540741 ceph-mgr[75324]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:12:36 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'osd_support'
Dec  1 04:12:36 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:36.424+0000 7f54427be140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:12:36 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec  1 04:12:36 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4191684427' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]: 
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]: {
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    "health": {
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "status": "HEALTH_OK",
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "checks": {},
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "mutes": []
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    },
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    "election_epoch": 5,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    "quorum": [
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        0
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    ],
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    "quorum_names": [
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "compute-0"
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    ],
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    "quorum_age": 13,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    "monmap": {
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "epoch": 1,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "min_mon_release_name": "reef",
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "num_mons": 1
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    },
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    "osdmap": {
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "epoch": 1,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "num_osds": 0,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "num_up_osds": 0,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "osd_up_since": 0,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "num_in_osds": 0,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "osd_in_since": 0,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "num_remapped_pgs": 0
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    },
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    "pgmap": {
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "pgs_by_state": [],
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "num_pgs": 0,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "num_pools": 0,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "num_objects": 0,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "data_bytes": 0,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "bytes_used": 0,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "bytes_avail": 0,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "bytes_total": 0
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    },
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    "fsmap": {
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "epoch": 1,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "by_rank": [],
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "up:standby": 0
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    },
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    "mgrmap": {
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "available": false,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "num_standbys": 0,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "modules": [
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:            "iostat",
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:            "nfs",
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:            "restful"
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        ],
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "services": {}
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    },
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    "servicemap": {
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "epoch": 1,
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "modified": "2025-12-01T09:12:20.101670+0000",
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:        "services": {}
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    },
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]:    "progress_events": {}
Dec  1 04:12:36 np0005540741 charming_elbakyan[75599]: }
Dec  1 04:12:36 np0005540741 systemd[1]: libpod-a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170.scope: Deactivated successfully.
Dec  1 04:12:36 np0005540741 podman[75583]: 2025-12-01 09:12:36.623202688 +0000 UTC m=+0.515646462 container died a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170 (image=quay.io/ceph/ceph:v18, name=charming_elbakyan, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True)
Dec  1 04:12:36 np0005540741 systemd[1]: var-lib-containers-storage-overlay-f336aba3ff6ab595c6e66e85c5dbe7758c9dce32dfb6f9e6990178971a118afe-merged.mount: Deactivated successfully.
Dec  1 04:12:36 np0005540741 ceph-mgr[75324]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:12:36 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'pg_autoscaler'
Dec  1 04:12:36 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:36.697+0000 7f54427be140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:12:36 np0005540741 podman[75583]: 2025-12-01 09:12:36.713678511 +0000 UTC m=+0.606122285 container remove a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170 (image=quay.io/ceph/ceph:v18, name=charming_elbakyan, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  1 04:12:36 np0005540741 systemd[1]: libpod-conmon-a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170.scope: Deactivated successfully.
Dec  1 04:12:37 np0005540741 ceph-mgr[75324]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:12:37 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'progress'
Dec  1 04:12:37 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:37.007+0000 7f54427be140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:12:37 np0005540741 ceph-mgr[75324]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:12:37 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'prometheus'
Dec  1 04:12:37 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:37.271+0000 7f54427be140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:12:38 np0005540741 ceph-mgr[75324]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:12:38 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:38.350+0000 7f54427be140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:12:38 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'rbd_support'
Dec  1 04:12:38 np0005540741 ceph-mgr[75324]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:12:38 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'restful'
Dec  1 04:12:38 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:38.691+0000 7f54427be140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:12:38 np0005540741 podman[75639]: 2025-12-01 09:12:38.81720805 +0000 UTC m=+0.076306300 container create 679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045 (image=quay.io/ceph/ceph:v18, name=ecstatic_wu, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507)
Dec  1 04:12:38 np0005540741 systemd[1]: Started libpod-conmon-679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045.scope.
Dec  1 04:12:38 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f5af3dd0bb1be36ce30127e19f51b77b5caa37cc0fa73d679faa47789ad1c34/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f5af3dd0bb1be36ce30127e19f51b77b5caa37cc0fa73d679faa47789ad1c34/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f5af3dd0bb1be36ce30127e19f51b77b5caa37cc0fa73d679faa47789ad1c34/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:38 np0005540741 podman[75639]: 2025-12-01 09:12:38.770987766 +0000 UTC m=+0.030086036 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:38 np0005540741 podman[75639]: 2025-12-01 09:12:38.882423025 +0000 UTC m=+0.141521295 container init 679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045 (image=quay.io/ceph/ceph:v18, name=ecstatic_wu, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:12:38 np0005540741 podman[75639]: 2025-12-01 09:12:38.887479768 +0000 UTC m=+0.146578018 container start 679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045 (image=quay.io/ceph/ceph:v18, name=ecstatic_wu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef)
Dec  1 04:12:38 np0005540741 podman[75639]: 2025-12-01 09:12:38.890664759 +0000 UTC m=+0.149763029 container attach 679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045 (image=quay.io/ceph/ceph:v18, name=ecstatic_wu, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:12:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec  1 04:12:39 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2405312232' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]: 
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]: {
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    "health": {
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "status": "HEALTH_OK",
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "checks": {},
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "mutes": []
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    },
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    "election_epoch": 5,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    "quorum": [
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        0
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    ],
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    "quorum_names": [
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "compute-0"
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    ],
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    "quorum_age": 16,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    "monmap": {
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "epoch": 1,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "min_mon_release_name": "reef",
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "num_mons": 1
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    },
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    "osdmap": {
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "epoch": 1,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "num_osds": 0,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "num_up_osds": 0,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "osd_up_since": 0,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "num_in_osds": 0,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "osd_in_since": 0,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "num_remapped_pgs": 0
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    },
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    "pgmap": {
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "pgs_by_state": [],
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "num_pgs": 0,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "num_pools": 0,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "num_objects": 0,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "data_bytes": 0,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "bytes_used": 0,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "bytes_avail": 0,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "bytes_total": 0
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    },
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    "fsmap": {
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "epoch": 1,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "by_rank": [],
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "up:standby": 0
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    },
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    "mgrmap": {
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "available": false,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "num_standbys": 0,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "modules": [
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:            "iostat",
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:            "nfs",
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:            "restful"
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        ],
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "services": {}
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    },
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    "servicemap": {
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "epoch": 1,
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "modified": "2025-12-01T09:12:20.101670+0000",
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:        "services": {}
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    },
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]:    "progress_events": {}
Dec  1 04:12:39 np0005540741 ecstatic_wu[75655]: }
Dec  1 04:12:39 np0005540741 systemd[1]: libpod-679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045.scope: Deactivated successfully.
Dec  1 04:12:39 np0005540741 podman[75681]: 2025-12-01 09:12:39.359213901 +0000 UTC m=+0.023289393 container died 679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045 (image=quay.io/ceph/ceph:v18, name=ecstatic_wu, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:12:39 np0005540741 systemd[1]: var-lib-containers-storage-overlay-1f5af3dd0bb1be36ce30127e19f51b77b5caa37cc0fa73d679faa47789ad1c34-merged.mount: Deactivated successfully.
Dec  1 04:12:39 np0005540741 podman[75681]: 2025-12-01 09:12:39.407935097 +0000 UTC m=+0.072010569 container remove 679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045 (image=quay.io/ceph/ceph:v18, name=ecstatic_wu, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Dec  1 04:12:39 np0005540741 systemd[1]: libpod-conmon-679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045.scope: Deactivated successfully.
Dec  1 04:12:39 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'rgw'
Dec  1 04:12:40 np0005540741 ceph-mgr[75324]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:12:40 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'rook'
Dec  1 04:12:40 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:40.209+0000 7f54427be140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:12:41 np0005540741 podman[75697]: 2025-12-01 09:12:41.471945112 +0000 UTC m=+0.038529566 container create f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631 (image=quay.io/ceph/ceph:v18, name=optimistic_ellis, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec  1 04:12:41 np0005540741 systemd[1]: Started libpod-conmon-f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631.scope.
Dec  1 04:12:41 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:41 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ba195f68fc936af12720c541d59f1025ce9c83c7321599a69c053913c2b9bc0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:41 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ba195f68fc936af12720c541d59f1025ce9c83c7321599a69c053913c2b9bc0/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:41 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ba195f68fc936af12720c541d59f1025ce9c83c7321599a69c053913c2b9bc0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:41 np0005540741 podman[75697]: 2025-12-01 09:12:41.455706331 +0000 UTC m=+0.022290815 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:41 np0005540741 podman[75697]: 2025-12-01 09:12:41.562085505 +0000 UTC m=+0.128669999 container init f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631 (image=quay.io/ceph/ceph:v18, name=optimistic_ellis, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:12:41 np0005540741 podman[75697]: 2025-12-01 09:12:41.567699565 +0000 UTC m=+0.134284029 container start f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631 (image=quay.io/ceph/ceph:v18, name=optimistic_ellis, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Dec  1 04:12:41 np0005540741 podman[75697]: 2025-12-01 09:12:41.570998549 +0000 UTC m=+0.137583053 container attach f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631 (image=quay.io/ceph/ceph:v18, name=optimistic_ellis, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec  1 04:12:41 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec  1 04:12:41 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1192011265' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]: 
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]: {
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    "health": {
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "status": "HEALTH_OK",
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "checks": {},
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "mutes": []
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    },
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    "election_epoch": 5,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    "quorum": [
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        0
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    ],
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    "quorum_names": [
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "compute-0"
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    ],
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    "quorum_age": 19,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    "monmap": {
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "epoch": 1,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "min_mon_release_name": "reef",
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "num_mons": 1
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    },
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    "osdmap": {
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "epoch": 1,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "num_osds": 0,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "num_up_osds": 0,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "osd_up_since": 0,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "num_in_osds": 0,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "osd_in_since": 0,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "num_remapped_pgs": 0
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    },
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    "pgmap": {
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "pgs_by_state": [],
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "num_pgs": 0,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "num_pools": 0,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "num_objects": 0,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "data_bytes": 0,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "bytes_used": 0,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "bytes_avail": 0,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "bytes_total": 0
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    },
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    "fsmap": {
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "epoch": 1,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "by_rank": [],
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "up:standby": 0
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    },
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    "mgrmap": {
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "available": false,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "num_standbys": 0,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "modules": [
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:            "iostat",
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:            "nfs",
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:            "restful"
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        ],
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "services": {}
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    },
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    "servicemap": {
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "epoch": 1,
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "modified": "2025-12-01T09:12:20.101670+0000",
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:        "services": {}
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    },
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]:    "progress_events": {}
Dec  1 04:12:41 np0005540741 optimistic_ellis[75713]: }
Dec  1 04:12:41 np0005540741 systemd[1]: libpod-f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631.scope: Deactivated successfully.
Dec  1 04:12:41 np0005540741 podman[75697]: 2025-12-01 09:12:41.989011044 +0000 UTC m=+0.555595508 container died f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631 (image=quay.io/ceph/ceph:v18, name=optimistic_ellis, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:12:42 np0005540741 systemd[1]: var-lib-containers-storage-overlay-9ba195f68fc936af12720c541d59f1025ce9c83c7321599a69c053913c2b9bc0-merged.mount: Deactivated successfully.
Dec  1 04:12:42 np0005540741 podman[75697]: 2025-12-01 09:12:42.029267959 +0000 UTC m=+0.595852423 container remove f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631 (image=quay.io/ceph/ceph:v18, name=optimistic_ellis, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:12:42 np0005540741 systemd[1]: libpod-conmon-f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631.scope: Deactivated successfully.
Dec  1 04:12:42 np0005540741 ceph-mgr[75324]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:12:42 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'selftest'
Dec  1 04:12:42 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:42.480+0000 7f54427be140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:12:42 np0005540741 ceph-mgr[75324]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:12:42 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'snap_schedule'
Dec  1 04:12:42 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:42.745+0000 7f54427be140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:12:43 np0005540741 ceph-mgr[75324]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:12:43 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'stats'
Dec  1 04:12:43 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:43.017+0000 7f54427be140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:12:43 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'status'
Dec  1 04:12:43 np0005540741 ceph-mgr[75324]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:12:43 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'telegraf'
Dec  1 04:12:43 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:43.585+0000 7f54427be140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:12:43 np0005540741 ceph-mgr[75324]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:12:43 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'telemetry'
Dec  1 04:12:43 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:43.835+0000 7f54427be140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:12:44 np0005540741 podman[75755]: 2025-12-01 09:12:44.072610428 +0000 UTC m=+0.021644757 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:44 np0005540741 ceph-mgr[75324]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:12:44 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'test_orchestrator'
Dec  1 04:12:44 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:44.523+0000 7f54427be140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:12:45 np0005540741 ceph-mgr[75324]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:12:45 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:45.269+0000 7f54427be140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:12:45 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'volumes'
Dec  1 04:12:45 np0005540741 podman[75755]: 2025-12-01 09:12:45.706565766 +0000 UTC m=+1.655600065 container create 7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4 (image=quay.io/ceph/ceph:v18, name=jovial_lamport, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:12:45 np0005540741 systemd[1]: Started libpod-conmon-7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4.scope.
Dec  1 04:12:45 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:45 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c006491637e3f311018b7732dd558850085fa1f9a3c32149b01a12087fce55d/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:45 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c006491637e3f311018b7732dd558850085fa1f9a3c32149b01a12087fce55d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:45 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c006491637e3f311018b7732dd558850085fa1f9a3c32149b01a12087fce55d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:45 np0005540741 podman[75755]: 2025-12-01 09:12:45.830377476 +0000 UTC m=+1.779411795 container init 7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4 (image=quay.io/ceph/ceph:v18, name=jovial_lamport, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:12:45 np0005540741 podman[75755]: 2025-12-01 09:12:45.83756796 +0000 UTC m=+1.786602259 container start 7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4 (image=quay.io/ceph/ceph:v18, name=jovial_lamport, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec  1 04:12:45 np0005540741 podman[75755]: 2025-12-01 09:12:45.841021029 +0000 UTC m=+1.790055358 container attach 7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4 (image=quay.io/ceph/ceph:v18, name=jovial_lamport, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'zabbix'
Dec  1 04:12:46 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:46.125+0000 7f54427be140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3735729918' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]: 
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]: {
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    "health": {
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "status": "HEALTH_OK",
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "checks": {},
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "mutes": []
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    },
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    "election_epoch": 5,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    "quorum": [
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        0
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    ],
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    "quorum_names": [
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "compute-0"
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    ],
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    "quorum_age": 23,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    "monmap": {
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "epoch": 1,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "min_mon_release_name": "reef",
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "num_mons": 1
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    },
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    "osdmap": {
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "epoch": 1,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "num_osds": 0,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "num_up_osds": 0,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "osd_up_since": 0,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "num_in_osds": 0,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "osd_in_since": 0,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "num_remapped_pgs": 0
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    },
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    "pgmap": {
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "pgs_by_state": [],
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "num_pgs": 0,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "num_pools": 0,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "num_objects": 0,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "data_bytes": 0,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "bytes_used": 0,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "bytes_avail": 0,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "bytes_total": 0
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    },
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    "fsmap": {
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "epoch": 1,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "by_rank": [],
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "up:standby": 0
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    },
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    "mgrmap": {
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "available": false,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "num_standbys": 0,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "modules": [
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:            "iostat",
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:            "nfs",
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:            "restful"
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        ],
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "services": {}
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    },
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    "servicemap": {
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "epoch": 1,
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "modified": "2025-12-01T09:12:20.101670+0000",
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:        "services": {}
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    },
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]:    "progress_events": {}
Dec  1 04:12:46 np0005540741 jovial_lamport[75771]: }
Dec  1 04:12:46 np0005540741 systemd[1]: libpod-7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4.scope: Deactivated successfully.
Dec  1 04:12:46 np0005540741 podman[75797]: 2025-12-01 09:12:46.388262458 +0000 UTC m=+0.053830631 container died 7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4 (image=quay.io/ceph/ceph:v18, name=jovial_lamport, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef)
Dec  1 04:12:46 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:46.405+0000 7f54427be140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: ms_deliver_dispatch: unhandled message 0x55f9a53f31e0 mon_map magic: 0 v1 from mon.0 v2:192.168.122.100:3300/0
Dec  1 04:12:46 np0005540741 systemd[1]: var-lib-containers-storage-overlay-7c006491637e3f311018b7732dd558850085fa1f9a3c32149b01a12087fce55d-merged.mount: Deactivated successfully.
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : Activating manager daemon compute-0.psduho
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr handle_mgr_map Activating!
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr handle_mgr_map I am now activating
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e2: compute-0.psduho(active, starting, since 0.00914665s)
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mds metadata"} v 0) v1
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mds metadata"}]: dispatch
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).mds e1 all = 1
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon metadata"} v 0) v1
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mon metadata"}]: dispatch
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon metadata", "id": "compute-0"} v 0) v1
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata", "who": "compute-0.psduho", "id": "compute-0.psduho"} v 0) v1
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mgr metadata", "who": "compute-0.psduho", "id": "compute-0.psduho"}]: dispatch
Dec  1 04:12:46 np0005540741 podman[75797]: 2025-12-01 09:12:46.429070199 +0000 UTC m=+0.094638372 container remove 7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4 (image=quay.io/ceph/ceph:v18, name=jovial_lamport, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: balancer
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [balancer INFO root] Starting
Dec  1 04:12:46 np0005540741 systemd[1]: libpod-conmon-7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4.scope: Deactivated successfully.
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: crash
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : Manager daemon compute-0.psduho is now available
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: devicehealth
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [devicehealth INFO root] Starting
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: iostat
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: nfs
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: orchestrator
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:12:46
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [balancer INFO root] No pools available
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: pg_autoscaler
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: progress
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [progress INFO root] Loading...
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [progress INFO root] No stored events to load
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [progress INFO root] Loaded [] historic events
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [progress INFO root] Loaded OSDMap, ready.
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] recovery thread starting
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] starting setup
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: rbd_support
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: restful
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [restful INFO root] server_addr: :: server_port: 8003
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [restful WARNING root] server not running: no certificate configured
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: status
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/mirror_snapshot_schedule"} v 0) v1
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/mirror_snapshot_schedule"}]: dispatch
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: telemetry
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/telemetry/report_id}] v 0) v1
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] PerfHandler: starting
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TaskHandler: starting
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/trash_purge_schedule"} v 0) v1
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/trash_purge_schedule"}]: dispatch
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' 
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/telemetry/salt}] v 0) v1
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] setup complete
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' 
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/telemetry/collection}] v 0) v1
Dec  1 04:12:46 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: volumes
Dec  1 04:12:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' 
Dec  1 04:12:47 np0005540741 ceph-mon[75031]: Activating manager daemon compute-0.psduho
Dec  1 04:12:47 np0005540741 ceph-mon[75031]: Manager daemon compute-0.psduho is now available
Dec  1 04:12:47 np0005540741 ceph-mon[75031]: from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/mirror_snapshot_schedule"}]: dispatch
Dec  1 04:12:47 np0005540741 ceph-mon[75031]: from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/trash_purge_schedule"}]: dispatch
Dec  1 04:12:47 np0005540741 ceph-mon[75031]: from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' 
Dec  1 04:12:47 np0005540741 ceph-mon[75031]: from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' 
Dec  1 04:12:47 np0005540741 ceph-mon[75031]: from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' 
Dec  1 04:12:47 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e3: compute-0.psduho(active, since 1.20235s)
Dec  1 04:12:48 np0005540741 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec  1 04:12:48 np0005540741 podman[75892]: 2025-12-01 09:12:48.504668414 +0000 UTC m=+0.047227924 container create ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2 (image=quay.io/ceph/ceph:v18, name=exciting_faraday, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec  1 04:12:48 np0005540741 systemd[1]: Started libpod-conmon-ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2.scope.
Dec  1 04:12:48 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:48 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef30f1925195992ef33c598d9f0ccf4d7f97c9de892f94d3d7741733a691b8c6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:48 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef30f1925195992ef33c598d9f0ccf4d7f97c9de892f94d3d7741733a691b8c6/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:48 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef30f1925195992ef33c598d9f0ccf4d7f97c9de892f94d3d7741733a691b8c6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:48 np0005540741 podman[75892]: 2025-12-01 09:12:48.574004466 +0000 UTC m=+0.116563976 container init ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2 (image=quay.io/ceph/ceph:v18, name=exciting_faraday, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec  1 04:12:48 np0005540741 podman[75892]: 2025-12-01 09:12:48.480786135 +0000 UTC m=+0.023345675 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:48 np0005540741 podman[75892]: 2025-12-01 09:12:48.584630258 +0000 UTC m=+0.127189788 container start ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2 (image=quay.io/ceph/ceph:v18, name=exciting_faraday, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec  1 04:12:48 np0005540741 podman[75892]: 2025-12-01 09:12:48.588700563 +0000 UTC m=+0.131260113 container attach ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2 (image=quay.io/ceph/ceph:v18, name=exciting_faraday, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:12:48 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e4: compute-0.psduho(active, since 2s)
Dec  1 04:12:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec  1 04:12:49 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2366778493' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]: 
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]: {
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    "health": {
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "status": "HEALTH_OK",
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "checks": {},
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "mutes": []
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    },
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    "election_epoch": 5,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    "quorum": [
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        0
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    ],
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    "quorum_names": [
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "compute-0"
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    ],
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    "quorum_age": 26,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    "monmap": {
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "epoch": 1,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "min_mon_release_name": "reef",
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "num_mons": 1
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    },
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    "osdmap": {
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "epoch": 1,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "num_osds": 0,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "num_up_osds": 0,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "osd_up_since": 0,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "num_in_osds": 0,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "osd_in_since": 0,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "num_remapped_pgs": 0
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    },
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    "pgmap": {
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "pgs_by_state": [],
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "num_pgs": 0,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "num_pools": 0,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "num_objects": 0,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "data_bytes": 0,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "bytes_used": 0,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "bytes_avail": 0,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "bytes_total": 0
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    },
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    "fsmap": {
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "epoch": 1,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "by_rank": [],
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "up:standby": 0
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    },
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    "mgrmap": {
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "available": true,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "num_standbys": 0,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "modules": [
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:            "iostat",
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:            "nfs",
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:            "restful"
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        ],
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "services": {}
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    },
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    "servicemap": {
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "epoch": 1,
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "modified": "2025-12-01T09:12:20.101670+0000",
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:        "services": {}
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    },
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]:    "progress_events": {}
Dec  1 04:12:49 np0005540741 exciting_faraday[75908]: }
Dec  1 04:12:49 np0005540741 systemd[1]: libpod-ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2.scope: Deactivated successfully.
Dec  1 04:12:49 np0005540741 podman[75892]: 2025-12-01 09:12:49.190537766 +0000 UTC m=+0.733097266 container died ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2 (image=quay.io/ceph/ceph:v18, name=exciting_faraday, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec  1 04:12:49 np0005540741 systemd[1]: var-lib-containers-storage-overlay-ef30f1925195992ef33c598d9f0ccf4d7f97c9de892f94d3d7741733a691b8c6-merged.mount: Deactivated successfully.
Dec  1 04:12:49 np0005540741 podman[75892]: 2025-12-01 09:12:49.280896895 +0000 UTC m=+0.823456405 container remove ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2 (image=quay.io/ceph/ceph:v18, name=exciting_faraday, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:12:49 np0005540741 systemd[1]: libpod-conmon-ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2.scope: Deactivated successfully.
Dec  1 04:12:49 np0005540741 podman[75946]: 2025-12-01 09:12:49.347632942 +0000 UTC m=+0.041502301 container create 4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2 (image=quay.io/ceph/ceph:v18, name=crazy_dijkstra, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec  1 04:12:49 np0005540741 systemd[1]: Started libpod-conmon-4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2.scope.
Dec  1 04:12:49 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:49 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5a6b58ab70cd89a09667bf796c174534b2436d214e6e38a0d0edb79b9ece471/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:49 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5a6b58ab70cd89a09667bf796c174534b2436d214e6e38a0d0edb79b9ece471/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:49 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5a6b58ab70cd89a09667bf796c174534b2436d214e6e38a0d0edb79b9ece471/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:49 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5a6b58ab70cd89a09667bf796c174534b2436d214e6e38a0d0edb79b9ece471/merged/var/lib/ceph/user.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:49 np0005540741 podman[75946]: 2025-12-01 09:12:49.413095774 +0000 UTC m=+0.106965143 container init 4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2 (image=quay.io/ceph/ceph:v18, name=crazy_dijkstra, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:12:49 np0005540741 podman[75946]: 2025-12-01 09:12:49.420233407 +0000 UTC m=+0.114102766 container start 4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2 (image=quay.io/ceph/ceph:v18, name=crazy_dijkstra, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:12:49 np0005540741 podman[75946]: 2025-12-01 09:12:49.42386262 +0000 UTC m=+0.117731979 container attach 4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2 (image=quay.io/ceph/ceph:v18, name=crazy_dijkstra, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec  1 04:12:49 np0005540741 podman[75946]: 2025-12-01 09:12:49.329764224 +0000 UTC m=+0.023633593 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:50 np0005540741 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec  1 04:12:50 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config assimilate-conf"} v 0) v1
Dec  1 04:12:50 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1715548196' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Dec  1 04:12:50 np0005540741 systemd[1]: libpod-4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2.scope: Deactivated successfully.
Dec  1 04:12:50 np0005540741 podman[75946]: 2025-12-01 09:12:50.586633781 +0000 UTC m=+1.280503140 container died 4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2 (image=quay.io/ceph/ceph:v18, name=crazy_dijkstra, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec  1 04:12:50 np0005540741 systemd[1]: var-lib-containers-storage-overlay-e5a6b58ab70cd89a09667bf796c174534b2436d214e6e38a0d0edb79b9ece471-merged.mount: Deactivated successfully.
Dec  1 04:12:50 np0005540741 podman[75946]: 2025-12-01 09:12:50.907570577 +0000 UTC m=+1.601439936 container remove 4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2 (image=quay.io/ceph/ceph:v18, name=crazy_dijkstra, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  1 04:12:50 np0005540741 systemd[1]: libpod-conmon-4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2.scope: Deactivated successfully.
Dec  1 04:12:51 np0005540741 podman[76006]: 2025-12-01 09:12:50.953871813 +0000 UTC m=+0.025317311 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:51 np0005540741 podman[76006]: 2025-12-01 09:12:51.072480306 +0000 UTC m=+0.143925784 container create 8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864 (image=quay.io/ceph/ceph:v18, name=practical_pare, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:12:51 np0005540741 systemd[1]: Started libpod-conmon-8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864.scope.
Dec  1 04:12:51 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:51 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb09c81a4e4c0f6e89c4aab9bb9e178eb2b1e96581aa46cf1d20bd3dc8d004ca/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:51 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb09c81a4e4c0f6e89c4aab9bb9e178eb2b1e96581aa46cf1d20bd3dc8d004ca/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:51 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb09c81a4e4c0f6e89c4aab9bb9e178eb2b1e96581aa46cf1d20bd3dc8d004ca/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:51 np0005540741 podman[76006]: 2025-12-01 09:12:51.160478188 +0000 UTC m=+0.231923696 container init 8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864 (image=quay.io/ceph/ceph:v18, name=practical_pare, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:12:51 np0005540741 podman[76006]: 2025-12-01 09:12:51.172381346 +0000 UTC m=+0.243826824 container start 8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864 (image=quay.io/ceph/ceph:v18, name=practical_pare, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:12:51 np0005540741 podman[76006]: 2025-12-01 09:12:51.176213485 +0000 UTC m=+0.247659013 container attach 8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864 (image=quay.io/ceph/ceph:v18, name=practical_pare, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:12:51 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module enable", "module": "cephadm"} v 0) v1
Dec  1 04:12:51 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2695057660' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "cephadm"}]: dispatch
Dec  1 04:12:51 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/1715548196' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Dec  1 04:12:51 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/2695057660' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "cephadm"}]: dispatch
Dec  1 04:12:51 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2695057660' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "cephadm"}]': finished
Dec  1 04:12:51 np0005540741 ceph-mgr[75324]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec  1 04:12:51 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e5: compute-0.psduho(active, since 5s)
Dec  1 04:12:51 np0005540741 systemd[1]: libpod-8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864.scope: Deactivated successfully.
Dec  1 04:12:51 np0005540741 podman[76006]: 2025-12-01 09:12:51.84711054 +0000 UTC m=+0.918556008 container died 8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864 (image=quay.io/ceph/ceph:v18, name=practical_pare, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:12:51 np0005540741 systemd[1]: var-lib-containers-storage-overlay-eb09c81a4e4c0f6e89c4aab9bb9e178eb2b1e96581aa46cf1d20bd3dc8d004ca-merged.mount: Deactivated successfully.
Dec  1 04:12:51 np0005540741 podman[76006]: 2025-12-01 09:12:51.888734883 +0000 UTC m=+0.960180351 container remove 8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864 (image=quay.io/ceph/ceph:v18, name=practical_pare, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec  1 04:12:51 np0005540741 systemd[1]: libpod-conmon-8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864.scope: Deactivated successfully.
Dec  1 04:12:51 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: ignoring --setuser ceph since I am not root
Dec  1 04:12:51 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: ignoring --setgroup ceph since I am not root
Dec  1 04:12:51 np0005540741 ceph-mgr[75324]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Dec  1 04:12:51 np0005540741 ceph-mgr[75324]: pidfile_write: ignore empty --pid-file
Dec  1 04:12:51 np0005540741 podman[76060]: 2025-12-01 09:12:51.952811575 +0000 UTC m=+0.043085946 container create 74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda (image=quay.io/ceph/ceph:v18, name=distracted_solomon, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:12:51 np0005540741 systemd[1]: Started libpod-conmon-74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda.scope.
Dec  1 04:12:52 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:52 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abc68b8bfee08fe848bc987b6a9c311f5ea87ec70cd30290aa47435cc3b00937/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:52 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abc68b8bfee08fe848bc987b6a9c311f5ea87ec70cd30290aa47435cc3b00937/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:52 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abc68b8bfee08fe848bc987b6a9c311f5ea87ec70cd30290aa47435cc3b00937/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:52 np0005540741 podman[76060]: 2025-12-01 09:12:52.031147313 +0000 UTC m=+0.121421714 container init 74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda (image=quay.io/ceph/ceph:v18, name=distracted_solomon, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:12:52 np0005540741 podman[76060]: 2025-12-01 09:12:51.935505373 +0000 UTC m=+0.025779784 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:52 np0005540741 podman[76060]: 2025-12-01 09:12:52.037416011 +0000 UTC m=+0.127690382 container start 74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda (image=quay.io/ceph/ceph:v18, name=distracted_solomon, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:12:52 np0005540741 podman[76060]: 2025-12-01 09:12:52.040693974 +0000 UTC m=+0.130968375 container attach 74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda (image=quay.io/ceph/ceph:v18, name=distracted_solomon, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec  1 04:12:52 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'alerts'
Dec  1 04:12:52 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:52.373+0000 7fd338ad4140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:12:52 np0005540741 ceph-mgr[75324]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:12:52 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'balancer'
Dec  1 04:12:52 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Dec  1 04:12:52 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1686219134' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec  1 04:12:52 np0005540741 distracted_solomon[76100]: {
Dec  1 04:12:52 np0005540741 distracted_solomon[76100]:    "epoch": 5,
Dec  1 04:12:52 np0005540741 distracted_solomon[76100]:    "available": true,
Dec  1 04:12:52 np0005540741 distracted_solomon[76100]:    "active_name": "compute-0.psduho",
Dec  1 04:12:52 np0005540741 distracted_solomon[76100]:    "num_standby": 0
Dec  1 04:12:52 np0005540741 distracted_solomon[76100]: }
Dec  1 04:12:52 np0005540741 systemd[1]: libpod-74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda.scope: Deactivated successfully.
Dec  1 04:12:52 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:52.646+0000 7fd338ad4140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:12:52 np0005540741 ceph-mgr[75324]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:12:52 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'cephadm'
Dec  1 04:12:52 np0005540741 podman[76126]: 2025-12-01 09:12:52.670646196 +0000 UTC m=+0.026419862 container died 74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda (image=quay.io/ceph/ceph:v18, name=distracted_solomon, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Dec  1 04:12:52 np0005540741 systemd[1]: var-lib-containers-storage-overlay-abc68b8bfee08fe848bc987b6a9c311f5ea87ec70cd30290aa47435cc3b00937-merged.mount: Deactivated successfully.
Dec  1 04:12:52 np0005540741 podman[76126]: 2025-12-01 09:12:52.706821634 +0000 UTC m=+0.062595280 container remove 74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda (image=quay.io/ceph/ceph:v18, name=distracted_solomon, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec  1 04:12:52 np0005540741 systemd[1]: libpod-conmon-74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda.scope: Deactivated successfully.
Dec  1 04:12:52 np0005540741 podman[76141]: 2025-12-01 09:12:52.754745257 +0000 UTC m=+0.020720400 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:12:52 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/2695057660' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "cephadm"}]': finished
Dec  1 04:12:52 np0005540741 podman[76141]: 2025-12-01 09:12:52.92332112 +0000 UTC m=+0.189296243 container create 237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1 (image=quay.io/ceph/ceph:v18, name=silly_gauss, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:12:52 np0005540741 systemd[1]: Started libpod-conmon-237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1.scope.
Dec  1 04:12:52 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:12:52 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533f47fcdf700e821496873914eb663c7b4db46799c7c8f6833ce92cb64b484c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:52 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533f47fcdf700e821496873914eb663c7b4db46799c7c8f6833ce92cb64b484c/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:52 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533f47fcdf700e821496873914eb663c7b4db46799c7c8f6833ce92cb64b484c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:12:52 np0005540741 podman[76141]: 2025-12-01 09:12:52.996164391 +0000 UTC m=+0.262139544 container init 237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1 (image=quay.io/ceph/ceph:v18, name=silly_gauss, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:12:53 np0005540741 podman[76141]: 2025-12-01 09:12:53.000879095 +0000 UTC m=+0.266854218 container start 237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1 (image=quay.io/ceph/ceph:v18, name=silly_gauss, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:12:53 np0005540741 podman[76141]: 2025-12-01 09:12:53.004523909 +0000 UTC m=+0.270499032 container attach 237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1 (image=quay.io/ceph/ceph:v18, name=silly_gauss, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:12:54 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'crash'
Dec  1 04:12:54 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:54.939+0000 7fd338ad4140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:12:54 np0005540741 ceph-mgr[75324]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:12:54 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'dashboard'
Dec  1 04:12:56 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'devicehealth'
Dec  1 04:12:56 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:56.732+0000 7fd338ad4140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:12:56 np0005540741 ceph-mgr[75324]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:12:56 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'diskprediction_local'
Dec  1 04:12:57 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  1 04:12:57 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  1 04:12:57 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]:  from numpy import show_config as show_numpy_config
Dec  1 04:12:57 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:57.314+0000 7fd338ad4140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:12:57 np0005540741 ceph-mgr[75324]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:12:57 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'influx'
Dec  1 04:12:57 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:57.595+0000 7fd338ad4140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:12:57 np0005540741 ceph-mgr[75324]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:12:57 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'insights'
Dec  1 04:12:57 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'iostat'
Dec  1 04:12:58 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:58.109+0000 7fd338ad4140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:12:58 np0005540741 ceph-mgr[75324]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:12:58 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'k8sevents'
Dec  1 04:12:59 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'localpool'
Dec  1 04:13:00 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'mds_autoscaler'
Dec  1 04:13:00 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'mirroring'
Dec  1 04:13:01 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'nfs'
Dec  1 04:13:01 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:01.955+0000 7fd338ad4140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:13:01 np0005540741 ceph-mgr[75324]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:13:01 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'orchestrator'
Dec  1 04:13:02 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:02.680+0000 7fd338ad4140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:13:02 np0005540741 ceph-mgr[75324]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:13:02 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'osd_perf_query'
Dec  1 04:13:02 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:02.978+0000 7fd338ad4140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:13:02 np0005540741 ceph-mgr[75324]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:13:02 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'osd_support'
Dec  1 04:13:03 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:03.251+0000 7fd338ad4140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:13:03 np0005540741 ceph-mgr[75324]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:13:03 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'pg_autoscaler'
Dec  1 04:13:03 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:03.549+0000 7fd338ad4140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:13:03 np0005540741 ceph-mgr[75324]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:13:03 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'progress'
Dec  1 04:13:03 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:03.789+0000 7fd338ad4140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:13:03 np0005540741 ceph-mgr[75324]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:13:03 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'prometheus'
Dec  1 04:13:04 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:04.860+0000 7fd338ad4140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:13:04 np0005540741 ceph-mgr[75324]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:13:04 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'rbd_support'
Dec  1 04:13:05 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:05.184+0000 7fd338ad4140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:13:05 np0005540741 ceph-mgr[75324]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:13:05 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'restful'
Dec  1 04:13:05 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'rgw'
Dec  1 04:13:06 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:06.688+0000 7fd338ad4140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:13:06 np0005540741 ceph-mgr[75324]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:13:06 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'rook'
Dec  1 04:13:09 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:09.058+0000 7fd338ad4140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:13:09 np0005540741 ceph-mgr[75324]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:13:09 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'selftest'
Dec  1 04:13:09 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:09.321+0000 7fd338ad4140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:13:09 np0005540741 ceph-mgr[75324]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:13:09 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'snap_schedule'
Dec  1 04:13:09 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:09.594+0000 7fd338ad4140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:13:09 np0005540741 ceph-mgr[75324]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:13:09 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'stats'
Dec  1 04:13:09 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'status'
Dec  1 04:13:10 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:10.173+0000 7fd338ad4140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:13:10 np0005540741 ceph-mgr[75324]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:13:10 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'telegraf'
Dec  1 04:13:10 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:10.431+0000 7fd338ad4140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:13:10 np0005540741 ceph-mgr[75324]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:13:10 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'telemetry'
Dec  1 04:13:11 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:11.136+0000 7fd338ad4140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:13:11 np0005540741 ceph-mgr[75324]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:13:11 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'test_orchestrator'
Dec  1 04:13:11 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:11.898+0000 7fd338ad4140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:13:11 np0005540741 ceph-mgr[75324]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:13:11 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'volumes'
Dec  1 04:13:12 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:12.682+0000 7fd338ad4140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:13:12 np0005540741 ceph-mgr[75324]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:13:12 np0005540741 ceph-mgr[75324]: mgr[py] Loading python module 'zabbix'
Dec  1 04:13:12 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:12.923+0000 7fd338ad4140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:13:12 np0005540741 ceph-mgr[75324]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : Active manager daemon compute-0.psduho restarted
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e1 do_prune osdmap full prune enabled
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e1 encode_pending skipping prime_pg_temp; mapping job did not start
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : Activating manager daemon compute-0.psduho
Dec  1 04:13:12 np0005540741 ceph-mgr[75324]: ms_deliver_dispatch: unhandled message 0x56106644b1e0 mon_map magic: 0 v1 from mon.0 v2:192.168.122.100:3300/0
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e1 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e1 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e2 e2: 0 total, 0 up, 0 in
Dec  1 04:13:12 np0005540741 ceph-mgr[75324]: mgr handle_mgr_map Activating!
Dec  1 04:13:12 np0005540741 ceph-mgr[75324]: mgr handle_mgr_map I am now activating
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e2: 0 total, 0 up, 0 in
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e6: compute-0.psduho(active, starting, since 0.0169613s)
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon metadata", "id": "compute-0"} v 0) v1
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata", "who": "compute-0.psduho", "id": "compute-0.psduho"} v 0) v1
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mgr metadata", "who": "compute-0.psduho", "id": "compute-0.psduho"}]: dispatch
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mds metadata"} v 0) v1
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mds metadata"}]: dispatch
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).mds e1 all = 1
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon metadata"} v 0) v1
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mon metadata"}]: dispatch
Dec  1 04:13:12 np0005540741 ceph-mgr[75324]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:13:12 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: balancer
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : Manager daemon compute-0.psduho is now available
Dec  1 04:13:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Starting
Dec  1 04:13:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:13:12
Dec  1 04:13:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:13:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:13:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] No pools available
Dec  1 04:13:12 np0005540741 ceph-mgr[75324]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:13:12 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.migrations] Found migration_current of "None". Setting to last migration.
Dec  1 04:13:12 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Found migration_current of "None". Setting to last migration.
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mgr/cephadm/migration_current}] v 0) v1
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: Active manager daemon compute-0.psduho restarted
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: Activating manager daemon compute-0.psduho
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: Manager daemon compute-0.psduho is now available
Dec  1 04:13:12 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/config_checks}] v 0) v1
Dec  1 04:13:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: cephadm
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: crash
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: devicehealth
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [devicehealth INFO root] Starting
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: iostat
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: nfs
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: orchestrator
Dec  1 04:13:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0) v1
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: pg_autoscaler
Dec  1 04:13:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: progress
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [progress INFO root] Loading...
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [progress INFO root] No stored events to load
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [progress INFO root] Loaded [] historic events
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [progress INFO root] Loaded OSDMap, ready.
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:13:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0) v1
Dec  1 04:13:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] recovery thread starting
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] starting setup
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: rbd_support
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: restful
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: status
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [restful INFO root] server_addr: :: server_port: 8003
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [restful WARNING root] server not running: no certificate configured
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: telemetry
Dec  1 04:13:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/mirror_snapshot_schedule"} v 0) v1
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:13:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/mirror_snapshot_schedule"}]: dispatch
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] PerfHandler: starting
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TaskHandler: starting
Dec  1 04:13:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/trash_purge_schedule"} v 0) v1
Dec  1 04:13:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/trash_purge_schedule"}]: dispatch
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] setup complete
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: mgr load Constructed class from module: volumes
Dec  1 04:13:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/cephadm_agent/root/cert}] v 0) v1
Dec  1 04:13:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/cephadm_agent/root/key}] v 0) v1
Dec  1 04:13:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:13 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e7: compute-0.psduho(active, since 1.0245s)
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14136 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch
Dec  1 04:13:13 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14136 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch
Dec  1 04:13:13 np0005540741 silly_gauss[76158]: {
Dec  1 04:13:13 np0005540741 silly_gauss[76158]:    "mgrmap_epoch": 7,
Dec  1 04:13:13 np0005540741 silly_gauss[76158]:    "initialized": true
Dec  1 04:13:13 np0005540741 silly_gauss[76158]: }
Dec  1 04:13:13 np0005540741 systemd[1]: libpod-237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1.scope: Deactivated successfully.
Dec  1 04:13:13 np0005540741 podman[76141]: 2025-12-01 09:13:13.983281453 +0000 UTC m=+21.249256596 container died 237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1 (image=quay.io/ceph/ceph:v18, name=silly_gauss, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec  1 04:13:14 np0005540741 ceph-mon[75031]: Found migration_current of "None". Setting to last migration.
Dec  1 04:13:14 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:14 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:14 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/mirror_snapshot_schedule"}]: dispatch
Dec  1 04:13:14 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/trash_purge_schedule"}]: dispatch
Dec  1 04:13:14 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:14 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:14 np0005540741 systemd[1]: var-lib-containers-storage-overlay-533f47fcdf700e821496873914eb663c7b4db46799c7c8f6833ce92cb64b484c-merged.mount: Deactivated successfully.
Dec  1 04:13:14 np0005540741 podman[76141]: 2025-12-01 09:13:14.033849153 +0000 UTC m=+21.299824276 container remove 237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1 (image=quay.io/ceph/ceph:v18, name=silly_gauss, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec  1 04:13:14 np0005540741 systemd[1]: libpod-conmon-237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1.scope: Deactivated successfully.
Dec  1 04:13:14 np0005540741 podman[76317]: 2025-12-01 09:13:14.103891552 +0000 UTC m=+0.045369677 container create 201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c (image=quay.io/ceph/ceph:v18, name=pedantic_shockley, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec  1 04:13:14 np0005540741 systemd[1]: Started libpod-conmon-201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c.scope.
Dec  1 04:13:14 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:14 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b31016c66e5295f453bf89da43e12b3696433ba52c0994a26dde030a8f94cfbb/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:14 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b31016c66e5295f453bf89da43e12b3696433ba52c0994a26dde030a8f94cfbb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:14 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b31016c66e5295f453bf89da43e12b3696433ba52c0994a26dde030a8f94cfbb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:14 np0005540741 podman[76317]: 2025-12-01 09:13:14.168257602 +0000 UTC m=+0.109735727 container init 201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c (image=quay.io/ceph/ceph:v18, name=pedantic_shockley, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:13:14 np0005540741 podman[76317]: 2025-12-01 09:13:14.174859698 +0000 UTC m=+0.116337813 container start 201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c (image=quay.io/ceph/ceph:v18, name=pedantic_shockley, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:13:14 np0005540741 podman[76317]: 2025-12-01 09:13:14.177714723 +0000 UTC m=+0.119192878 container attach 201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c (image=quay.io/ceph/ceph:v18, name=pedantic_shockley, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:13:14 np0005540741 podman[76317]: 2025-12-01 09:13:14.084419244 +0000 UTC m=+0.025897379 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:14 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14144 -' entity='client.admin' cmd=[{"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:13:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mgr/orchestrator/orchestrator}] v 0) v1
Dec  1 04:13:14 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0) v1
Dec  1 04:13:14 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec  1 04:13:14 np0005540741 systemd[1]: libpod-201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c.scope: Deactivated successfully.
Dec  1 04:13:14 np0005540741 podman[76317]: 2025-12-01 09:13:14.773311848 +0000 UTC m=+0.714789963 container died 201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c (image=quay.io/ceph/ceph:v18, name=pedantic_shockley, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  1 04:13:14 np0005540741 systemd[1]: var-lib-containers-storage-overlay-b31016c66e5295f453bf89da43e12b3696433ba52c0994a26dde030a8f94cfbb-merged.mount: Deactivated successfully.
Dec  1 04:13:14 np0005540741 podman[76317]: 2025-12-01 09:13:14.814628314 +0000 UTC m=+0.756106429 container remove 201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c (image=quay.io/ceph/ceph:v18, name=pedantic_shockley, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:13:14 np0005540741 systemd[1]: libpod-conmon-201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c.scope: Deactivated successfully.
Dec  1 04:13:14 np0005540741 podman[76372]: 2025-12-01 09:13:14.873997965 +0000 UTC m=+0.037827873 container create 1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d (image=quay.io/ceph/ceph:v18, name=wizardly_goodall, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef)
Dec  1 04:13:14 np0005540741 ceph-mgr[75324]: [cephadm INFO cherrypy.error] [01/Dec/2025:09:13:14] ENGINE Bus STARTING
Dec  1 04:13:14 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : [01/Dec/2025:09:13:14] ENGINE Bus STARTING
Dec  1 04:13:14 np0005540741 systemd[1]: Started libpod-conmon-1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d.scope.
Dec  1 04:13:14 np0005540741 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec  1 04:13:14 np0005540741 podman[76372]: 2025-12-01 09:13:14.857478585 +0000 UTC m=+0.021308513 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:14 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:14 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4730bdafe340a010fdaa835cc9e1c51938fc458a8e1816a68a54f086582c4e14/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:14 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4730bdafe340a010fdaa835cc9e1c51938fc458a8e1816a68a54f086582c4e14/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:14 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4730bdafe340a010fdaa835cc9e1c51938fc458a8e1816a68a54f086582c4e14/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:15 np0005540741 ceph-mgr[75324]: [cephadm INFO cherrypy.error] [01/Dec/2025:09:13:15] ENGINE Serving on http://192.168.122.100:8765
Dec  1 04:13:15 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : [01/Dec/2025:09:13:15] ENGINE Serving on http://192.168.122.100:8765
Dec  1 04:13:15 np0005540741 podman[76372]: 2025-12-01 09:13:15.097472507 +0000 UTC m=+0.261302435 container init 1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d (image=quay.io/ceph/ceph:v18, name=wizardly_goodall, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  1 04:13:15 np0005540741 podman[76372]: 2025-12-01 09:13:15.102570068 +0000 UTC m=+0.266399976 container start 1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d (image=quay.io/ceph/ceph:v18, name=wizardly_goodall, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec  1 04:13:15 np0005540741 ceph-mgr[75324]: [cephadm INFO cherrypy.error] [01/Dec/2025:09:13:15] ENGINE Serving on https://192.168.122.100:7150
Dec  1 04:13:15 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : [01/Dec/2025:09:13:15] ENGINE Serving on https://192.168.122.100:7150
Dec  1 04:13:15 np0005540741 ceph-mgr[75324]: [cephadm INFO cherrypy.error] [01/Dec/2025:09:13:15] ENGINE Bus STARTED
Dec  1 04:13:15 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : [01/Dec/2025:09:13:15] ENGINE Bus STARTED
Dec  1 04:13:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0) v1
Dec  1 04:13:15 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec  1 04:13:15 np0005540741 ceph-mgr[75324]: [cephadm INFO cherrypy.error] [01/Dec/2025:09:13:15] ENGINE Client ('192.168.122.100', 55524) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec  1 04:13:15 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : [01/Dec/2025:09:13:15] ENGINE Client ('192.168.122.100', 55524) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec  1 04:13:15 np0005540741 podman[76372]: 2025-12-01 09:13:15.27278434 +0000 UTC m=+0.436614278 container attach 1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d (image=quay.io/ceph/ceph:v18, name=wizardly_goodall, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:13:15 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14146 -' entity='client.admin' cmd=[{"prefix": "cephadm set-user", "user": "ceph-admin", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:13:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/ssh_user}] v 0) v1
Dec  1 04:13:15 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:15 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Set ssh ssh_user
Dec  1 04:13:15 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Set ssh ssh_user
Dec  1 04:13:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/ssh_config}] v 0) v1
Dec  1 04:13:15 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:15 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Set ssh ssh_config
Dec  1 04:13:15 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Set ssh ssh_config
Dec  1 04:13:15 np0005540741 ceph-mgr[75324]: [cephadm INFO root] ssh user set to ceph-admin. sudo will be used
Dec  1 04:13:15 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : ssh user set to ceph-admin. sudo will be used
Dec  1 04:13:15 np0005540741 wizardly_goodall[76399]: ssh user set to ceph-admin. sudo will be used
Dec  1 04:13:15 np0005540741 systemd[1]: libpod-1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d.scope: Deactivated successfully.
Dec  1 04:13:15 np0005540741 podman[76372]: 2025-12-01 09:13:15.6889674 +0000 UTC m=+0.852797308 container died 1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d (image=quay.io/ceph/ceph:v18, name=wizardly_goodall, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:13:15 np0005540741 systemd[1]: var-lib-containers-storage-overlay-4730bdafe340a010fdaa835cc9e1c51938fc458a8e1816a68a54f086582c4e14-merged.mount: Deactivated successfully.
Dec  1 04:13:15 np0005540741 podman[76372]: 2025-12-01 09:13:15.739521181 +0000 UTC m=+0.903351089 container remove 1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d (image=quay.io/ceph/ceph:v18, name=wizardly_goodall, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec  1 04:13:15 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:15 np0005540741 ceph-mon[75031]: [01/Dec/2025:09:13:14] ENGINE Bus STARTING
Dec  1 04:13:15 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:15 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:15 np0005540741 systemd[1]: libpod-conmon-1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d.scope: Deactivated successfully.
Dec  1 04:13:15 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e8: compute-0.psduho(active, since 2s)
Dec  1 04:13:15 np0005540741 podman[76448]: 2025-12-01 09:13:15.796918644 +0000 UTC m=+0.034485894 container create a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8 (image=quay.io/ceph/ceph:v18, name=interesting_chaum, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec  1 04:13:15 np0005540741 systemd[1]: Started libpod-conmon-a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8.scope.
Dec  1 04:13:15 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35674a77d3051546ebbf85c0e5ec1de6b626355bbbc277db15c37d8176afb885/merged/tmp/cephadm-ssh-key supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35674a77d3051546ebbf85c0e5ec1de6b626355bbbc277db15c37d8176afb885/merged/tmp/cephadm-ssh-key.pub supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35674a77d3051546ebbf85c0e5ec1de6b626355bbbc277db15c37d8176afb885/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35674a77d3051546ebbf85c0e5ec1de6b626355bbbc277db15c37d8176afb885/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35674a77d3051546ebbf85c0e5ec1de6b626355bbbc277db15c37d8176afb885/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:15 np0005540741 podman[76448]: 2025-12-01 09:13:15.864036436 +0000 UTC m=+0.101603686 container init a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8 (image=quay.io/ceph/ceph:v18, name=interesting_chaum, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec  1 04:13:15 np0005540741 podman[76448]: 2025-12-01 09:13:15.869042174 +0000 UTC m=+0.106609424 container start a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8 (image=quay.io/ceph/ceph:v18, name=interesting_chaum, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:13:15 np0005540741 podman[76448]: 2025-12-01 09:13:15.8719198 +0000 UTC m=+0.109487070 container attach a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8 (image=quay.io/ceph/ceph:v18, name=interesting_chaum, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:13:15 np0005540741 podman[76448]: 2025-12-01 09:13:15.782355102 +0000 UTC m=+0.019922362 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:16 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14148 -' entity='client.admin' cmd=[{"prefix": "cephadm set-priv-key", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:13:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/ssh_identity_key}] v 0) v1
Dec  1 04:13:16 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:16 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Set ssh ssh_identity_key
Dec  1 04:13:16 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Set ssh ssh_identity_key
Dec  1 04:13:16 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Set ssh private key
Dec  1 04:13:16 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Set ssh private key
Dec  1 04:13:16 np0005540741 systemd[1]: libpod-a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8.scope: Deactivated successfully.
Dec  1 04:13:16 np0005540741 podman[76448]: 2025-12-01 09:13:16.498776431 +0000 UTC m=+0.736343681 container died a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8 (image=quay.io/ceph/ceph:v18, name=interesting_chaum, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2)
Dec  1 04:13:16 np0005540741 systemd[1]: var-lib-containers-storage-overlay-35674a77d3051546ebbf85c0e5ec1de6b626355bbbc277db15c37d8176afb885-merged.mount: Deactivated successfully.
Dec  1 04:13:16 np0005540741 podman[76448]: 2025-12-01 09:13:16.540414917 +0000 UTC m=+0.777982167 container remove a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8 (image=quay.io/ceph/ceph:v18, name=interesting_chaum, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec  1 04:13:16 np0005540741 systemd[1]: libpod-conmon-a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8.scope: Deactivated successfully.
Dec  1 04:13:16 np0005540741 podman[76504]: 2025-12-01 09:13:16.595382908 +0000 UTC m=+0.037223796 container create b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3 (image=quay.io/ceph/ceph:v18, name=friendly_archimedes, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Dec  1 04:13:16 np0005540741 systemd[1]: Started libpod-conmon-b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3.scope.
Dec  1 04:13:16 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:16 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a42446b91196bd36de3d655ae987bd74aadf5d1572355a6d885bbde74432781d/merged/tmp/cephadm-ssh-key supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:16 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a42446b91196bd36de3d655ae987bd74aadf5d1572355a6d885bbde74432781d/merged/tmp/cephadm-ssh-key.pub supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:16 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a42446b91196bd36de3d655ae987bd74aadf5d1572355a6d885bbde74432781d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:16 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a42446b91196bd36de3d655ae987bd74aadf5d1572355a6d885bbde74432781d/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:16 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a42446b91196bd36de3d655ae987bd74aadf5d1572355a6d885bbde74432781d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:16 np0005540741 podman[76504]: 2025-12-01 09:13:16.647898216 +0000 UTC m=+0.089739134 container init b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3 (image=quay.io/ceph/ceph:v18, name=friendly_archimedes, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Dec  1 04:13:16 np0005540741 podman[76504]: 2025-12-01 09:13:16.655262815 +0000 UTC m=+0.097103703 container start b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3 (image=quay.io/ceph/ceph:v18, name=friendly_archimedes, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:13:16 np0005540741 podman[76504]: 2025-12-01 09:13:16.658193352 +0000 UTC m=+0.100034270 container attach b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3 (image=quay.io/ceph/ceph:v18, name=friendly_archimedes, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef)
Dec  1 04:13:16 np0005540741 podman[76504]: 2025-12-01 09:13:16.577635091 +0000 UTC m=+0.019475999 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:16 np0005540741 ceph-mon[75031]: [01/Dec/2025:09:13:15] ENGINE Serving on http://192.168.122.100:8765
Dec  1 04:13:16 np0005540741 ceph-mon[75031]: [01/Dec/2025:09:13:15] ENGINE Serving on https://192.168.122.100:7150
Dec  1 04:13:16 np0005540741 ceph-mon[75031]: [01/Dec/2025:09:13:15] ENGINE Bus STARTED
Dec  1 04:13:16 np0005540741 ceph-mon[75031]: [01/Dec/2025:09:13:15] ENGINE Client ('192.168.122.100', 55524) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec  1 04:13:16 np0005540741 ceph-mon[75031]: Set ssh ssh_user
Dec  1 04:13:16 np0005540741 ceph-mon[75031]: Set ssh ssh_config
Dec  1 04:13:16 np0005540741 ceph-mon[75031]: ssh user set to ceph-admin. sudo will be used
Dec  1 04:13:16 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:16 np0005540741 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec  1 04:13:17 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14150 -' entity='client.admin' cmd=[{"prefix": "cephadm set-pub-key", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:13:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/ssh_identity_pub}] v 0) v1
Dec  1 04:13:17 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:17 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Set ssh ssh_identity_pub
Dec  1 04:13:17 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Set ssh ssh_identity_pub
Dec  1 04:13:17 np0005540741 systemd[1]: libpod-b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3.scope: Deactivated successfully.
Dec  1 04:13:17 np0005540741 conmon[76521]: conmon b0082ac6a6cfe79c40f4 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3.scope/container/memory.events
Dec  1 04:13:17 np0005540741 podman[76504]: 2025-12-01 09:13:17.196523047 +0000 UTC m=+0.638363935 container died b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3 (image=quay.io/ceph/ceph:v18, name=friendly_archimedes, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Dec  1 04:13:17 np0005540741 systemd[1]: var-lib-containers-storage-overlay-a42446b91196bd36de3d655ae987bd74aadf5d1572355a6d885bbde74432781d-merged.mount: Deactivated successfully.
Dec  1 04:13:17 np0005540741 podman[76504]: 2025-12-01 09:13:17.236833953 +0000 UTC m=+0.678674841 container remove b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3 (image=quay.io/ceph/ceph:v18, name=friendly_archimedes, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 04:13:17 np0005540741 systemd[1]: libpod-conmon-b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3.scope: Deactivated successfully.
Dec  1 04:13:17 np0005540741 podman[76558]: 2025-12-01 09:13:17.29331975 +0000 UTC m=+0.038447822 container create 018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf (image=quay.io/ceph/ceph:v18, name=eloquent_moore, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:13:17 np0005540741 systemd[1]: Started libpod-conmon-018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf.scope.
Dec  1 04:13:17 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c8c0f145ad1053bea93ac7a3845cb01d29d92d1f875b2726294470e9fca2d71/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c8c0f145ad1053bea93ac7a3845cb01d29d92d1f875b2726294470e9fca2d71/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c8c0f145ad1053bea93ac7a3845cb01d29d92d1f875b2726294470e9fca2d71/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:17 np0005540741 podman[76558]: 2025-12-01 09:13:17.35163246 +0000 UTC m=+0.096760542 container init 018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf (image=quay.io/ceph/ceph:v18, name=eloquent_moore, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec  1 04:13:17 np0005540741 podman[76558]: 2025-12-01 09:13:17.356583627 +0000 UTC m=+0.101711689 container start 018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf (image=quay.io/ceph/ceph:v18, name=eloquent_moore, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:13:17 np0005540741 podman[76558]: 2025-12-01 09:13:17.359624797 +0000 UTC m=+0.104752859 container attach 018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf (image=quay.io/ceph/ceph:v18, name=eloquent_moore, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:13:17 np0005540741 podman[76558]: 2025-12-01 09:13:17.275601754 +0000 UTC m=+0.020729846 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:17 np0005540741 ceph-mon[75031]: Set ssh ssh_identity_key
Dec  1 04:13:17 np0005540741 ceph-mon[75031]: Set ssh private key
Dec  1 04:13:17 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:17 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14152 -' entity='client.admin' cmd=[{"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:13:17 np0005540741 eloquent_moore[76574]: ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDD5sBspn5Xp7CW5nM5RD3iftue/dLFDaYdYMKUOd5ZIzzsN79oQmlltrvDT8HaitCYNC57cVlt7wHN9td+6mcvwmDWLz3o/V+FnCmxsy48GjfcC1QefkBt+998r7HDAx63TkBJ1854CvC2wj92eb6gIWJA7E2cOUv5PCGoqcgonQyMLYSYm4G9uoxpEJeXbBaB94cMGl+dCQbIg8yOceqUlnoZ2+ACyrkBUxbI9JmOBw29M1PMIaYdFW7urAEFJRiovkfYNPVJeMZZNYL4efjsE13flKatlgazaIJqjHCrthfeRq1Hj7qpaJYaubPdhUFuXb2qqTVg/lOwO/R4VJVZDbyOOpncF0p5pv4pZkvb3qMGCM605lN8C8aHi8734oLSBYIDtVMA4HgPLo6nbUtCrzvqfceioWkYNymprvj5Wm/jN1gAtEf9mf4ZPu0uuzHCLbku5lddg770u13ZPqylHCrgjIxnrvb4jygvTBd1myq7uHVdI/518cEH53q0hA0= zuul@controller
Dec  1 04:13:17 np0005540741 systemd[1]: libpod-018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf.scope: Deactivated successfully.
Dec  1 04:13:17 np0005540741 podman[76558]: 2025-12-01 09:13:17.912421382 +0000 UTC m=+0.657549484 container died 018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf (image=quay.io/ceph/ceph:v18, name=eloquent_moore, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:13:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e2 _set_new_cache_sizes cache_size:1019921889 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:13:18 np0005540741 systemd[1]: var-lib-containers-storage-overlay-2c8c0f145ad1053bea93ac7a3845cb01d29d92d1f875b2726294470e9fca2d71-merged.mount: Deactivated successfully.
Dec  1 04:13:18 np0005540741 podman[76558]: 2025-12-01 09:13:18.02657624 +0000 UTC m=+0.771704302 container remove 018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf (image=quay.io/ceph/ceph:v18, name=eloquent_moore, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec  1 04:13:18 np0005540741 systemd[1]: libpod-conmon-018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf.scope: Deactivated successfully.
Dec  1 04:13:18 np0005540741 podman[76612]: 2025-12-01 09:13:18.080284543 +0000 UTC m=+0.035610287 container create 91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172 (image=quay.io/ceph/ceph:v18, name=zen_mahavira, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:13:18 np0005540741 systemd[1]: Started libpod-conmon-91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172.scope.
Dec  1 04:13:18 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/746a7322c464db3f4e8fd9d40f5c3e04af5c9ab61050d1af559f36551205884d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/746a7322c464db3f4e8fd9d40f5c3e04af5c9ab61050d1af559f36551205884d/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/746a7322c464db3f4e8fd9d40f5c3e04af5c9ab61050d1af559f36551205884d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:18 np0005540741 podman[76612]: 2025-12-01 09:13:18.143084647 +0000 UTC m=+0.098410391 container init 91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172 (image=quay.io/ceph/ceph:v18, name=zen_mahavira, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec  1 04:13:18 np0005540741 podman[76612]: 2025-12-01 09:13:18.149662452 +0000 UTC m=+0.104988216 container start 91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172 (image=quay.io/ceph/ceph:v18, name=zen_mahavira, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec  1 04:13:18 np0005540741 podman[76612]: 2025-12-01 09:13:18.153567428 +0000 UTC m=+0.108893272 container attach 91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172 (image=quay.io/ceph/ceph:v18, name=zen_mahavira, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:13:18 np0005540741 podman[76612]: 2025-12-01 09:13:18.065167405 +0000 UTC m=+0.020493169 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:18 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14154 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "compute-0", "addr": "192.168.122.100", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:13:18 np0005540741 ceph-mon[75031]: Set ssh ssh_identity_pub
Dec  1 04:13:18 np0005540741 systemd[1]: Created slice User Slice of UID 42477.
Dec  1 04:13:18 np0005540741 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec  1 04:13:18 np0005540741 systemd-logind[788]: New session 21 of user ceph-admin.
Dec  1 04:13:18 np0005540741 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec  1 04:13:18 np0005540741 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec  1 04:13:18 np0005540741 systemd[1]: Starting User Manager for UID 42477...
Dec  1 04:13:19 np0005540741 systemd[76658]: Queued start job for default target Main User Target.
Dec  1 04:13:19 np0005540741 systemd-logind[788]: New session 23 of user ceph-admin.
Dec  1 04:13:19 np0005540741 systemd[76658]: Created slice User Application Slice.
Dec  1 04:13:19 np0005540741 systemd[76658]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  1 04:13:19 np0005540741 systemd[76658]: Started Daily Cleanup of User's Temporary Directories.
Dec  1 04:13:19 np0005540741 systemd[76658]: Reached target Paths.
Dec  1 04:13:19 np0005540741 systemd[76658]: Reached target Timers.
Dec  1 04:13:19 np0005540741 systemd[76658]: Starting D-Bus User Message Bus Socket...
Dec  1 04:13:19 np0005540741 systemd[76658]: Starting Create User's Volatile Files and Directories...
Dec  1 04:13:19 np0005540741 systemd[76658]: Finished Create User's Volatile Files and Directories.
Dec  1 04:13:19 np0005540741 systemd[76658]: Listening on D-Bus User Message Bus Socket.
Dec  1 04:13:19 np0005540741 systemd[76658]: Reached target Sockets.
Dec  1 04:13:19 np0005540741 systemd[76658]: Reached target Basic System.
Dec  1 04:13:19 np0005540741 systemd[76658]: Reached target Main User Target.
Dec  1 04:13:19 np0005540741 systemd[76658]: Startup finished in 149ms.
Dec  1 04:13:19 np0005540741 systemd[1]: Started User Manager for UID 42477.
Dec  1 04:13:19 np0005540741 systemd[1]: Started Session 21 of User ceph-admin.
Dec  1 04:13:19 np0005540741 systemd[1]: Started Session 23 of User ceph-admin.
Dec  1 04:13:19 np0005540741 systemd-logind[788]: New session 24 of user ceph-admin.
Dec  1 04:13:19 np0005540741 systemd[1]: Started Session 24 of User ceph-admin.
Dec  1 04:13:19 np0005540741 systemd-logind[788]: New session 25 of user ceph-admin.
Dec  1 04:13:19 np0005540741 systemd[1]: Started Session 25 of User ceph-admin.
Dec  1 04:13:20 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Deploying cephadm binary to compute-0
Dec  1 04:13:20 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Deploying cephadm binary to compute-0
Dec  1 04:13:20 np0005540741 systemd-logind[788]: New session 26 of user ceph-admin.
Dec  1 04:13:20 np0005540741 systemd[1]: Started Session 26 of User ceph-admin.
Dec  1 04:13:20 np0005540741 systemd-logind[788]: New session 27 of user ceph-admin.
Dec  1 04:13:20 np0005540741 systemd[1]: Started Session 27 of User ceph-admin.
Dec  1 04:13:20 np0005540741 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec  1 04:13:20 np0005540741 ceph-mon[75031]: Deploying cephadm binary to compute-0
Dec  1 04:13:21 np0005540741 systemd-logind[788]: New session 28 of user ceph-admin.
Dec  1 04:13:21 np0005540741 systemd[1]: Started Session 28 of User ceph-admin.
Dec  1 04:13:21 np0005540741 systemd-logind[788]: New session 29 of user ceph-admin.
Dec  1 04:13:21 np0005540741 systemd[1]: Started Session 29 of User ceph-admin.
Dec  1 04:13:21 np0005540741 systemd-logind[788]: New session 30 of user ceph-admin.
Dec  1 04:13:21 np0005540741 systemd[1]: Started Session 30 of User ceph-admin.
Dec  1 04:13:22 np0005540741 systemd-logind[788]: New session 31 of user ceph-admin.
Dec  1 04:13:22 np0005540741 systemd[1]: Started Session 31 of User ceph-admin.
Dec  1 04:13:22 np0005540741 systemd-logind[788]: New session 32 of user ceph-admin.
Dec  1 04:13:22 np0005540741 systemd[1]: Started Session 32 of User ceph-admin.
Dec  1 04:13:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e2 _set_new_cache_sizes cache_size:1020053034 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:13:22 np0005540741 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec  1 04:13:23 np0005540741 systemd-logind[788]: New session 33 of user ceph-admin.
Dec  1 04:13:23 np0005540741 systemd[1]: Started Session 33 of User ceph-admin.
Dec  1 04:13:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) v1
Dec  1 04:13:23 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:23 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Added host compute-0
Dec  1 04:13:23 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Added host compute-0
Dec  1 04:13:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0) v1
Dec  1 04:13:23 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec  1 04:13:23 np0005540741 zen_mahavira[76628]: Added host 'compute-0' with addr '192.168.122.100'
Dec  1 04:13:23 np0005540741 systemd[1]: libpod-91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172.scope: Deactivated successfully.
Dec  1 04:13:23 np0005540741 podman[76612]: 2025-12-01 09:13:23.914857286 +0000 UTC m=+5.870183030 container died 91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172 (image=quay.io/ceph/ceph:v18, name=zen_mahavira, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:13:24 np0005540741 systemd[1]: var-lib-containers-storage-overlay-746a7322c464db3f4e8fd9d40f5c3e04af5c9ab61050d1af559f36551205884d-merged.mount: Deactivated successfully.
Dec  1 04:13:24 np0005540741 podman[76612]: 2025-12-01 09:13:24.063419045 +0000 UTC m=+6.018744789 container remove 91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172 (image=quay.io/ceph/ceph:v18, name=zen_mahavira, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec  1 04:13:24 np0005540741 systemd[1]: libpod-conmon-91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172.scope: Deactivated successfully.
Dec  1 04:13:24 np0005540741 podman[77364]: 2025-12-01 09:13:24.130483105 +0000 UTC m=+0.042742539 container create f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94 (image=quay.io/ceph/ceph:v18, name=priceless_herschel, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:13:24 np0005540741 systemd[1]: Started libpod-conmon-f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94.scope.
Dec  1 04:13:24 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9d5a3b5a8d5b9bb446cd4268cb5f458c2969d3fb49469135ba214f3d46b8043/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9d5a3b5a8d5b9bb446cd4268cb5f458c2969d3fb49469135ba214f3d46b8043/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9d5a3b5a8d5b9bb446cd4268cb5f458c2969d3fb49469135ba214f3d46b8043/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:24 np0005540741 podman[77364]: 2025-12-01 09:13:24.199784112 +0000 UTC m=+0.112043576 container init f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94 (image=quay.io/ceph/ceph:v18, name=priceless_herschel, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:13:24 np0005540741 podman[77364]: 2025-12-01 09:13:24.205601844 +0000 UTC m=+0.117861288 container start f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94 (image=quay.io/ceph/ceph:v18, name=priceless_herschel, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:13:24 np0005540741 podman[77364]: 2025-12-01 09:13:24.114868912 +0000 UTC m=+0.027128376 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:24 np0005540741 podman[77364]: 2025-12-01 09:13:24.214491028 +0000 UTC m=+0.126750502 container attach f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94 (image=quay.io/ceph/ceph:v18, name=priceless_herschel, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:13:24 np0005540741 podman[77424]: 2025-12-01 09:13:24.532205937 +0000 UTC m=+0.074000907 container create 6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45 (image=quay.io/ceph/ceph:v18, name=bold_pasteur, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  1 04:13:24 np0005540741 systemd[1]: Started libpod-conmon-6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45.scope.
Dec  1 04:13:24 np0005540741 podman[77424]: 2025-12-01 09:13:24.490607472 +0000 UTC m=+0.032402442 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:24 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:24 np0005540741 podman[77424]: 2025-12-01 09:13:24.604232284 +0000 UTC m=+0.146027284 container init 6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45 (image=quay.io/ceph/ceph:v18, name=bold_pasteur, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec  1 04:13:24 np0005540741 podman[77424]: 2025-12-01 09:13:24.61049828 +0000 UTC m=+0.152293240 container start 6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45 (image=quay.io/ceph/ceph:v18, name=bold_pasteur, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef)
Dec  1 04:13:24 np0005540741 podman[77424]: 2025-12-01 09:13:24.613589352 +0000 UTC m=+0.155384312 container attach 6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45 (image=quay.io/ceph/ceph:v18, name=bold_pasteur, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec  1 04:13:24 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14156 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:13:24 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Saving service mon spec with placement count:5
Dec  1 04:13:24 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Saving service mon spec with placement count:5
Dec  1 04:13:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) v1
Dec  1 04:13:24 np0005540741 bold_pasteur[77459]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable)
Dec  1 04:13:24 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:24 np0005540741 priceless_herschel[77392]: Scheduled mon update...
Dec  1 04:13:24 np0005540741 systemd[1]: libpod-6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45.scope: Deactivated successfully.
Dec  1 04:13:24 np0005540741 podman[77424]: 2025-12-01 09:13:24.924388765 +0000 UTC m=+0.466183735 container died 6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45 (image=quay.io/ceph/ceph:v18, name=bold_pasteur, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:13:24 np0005540741 systemd[1]: libpod-f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94.scope: Deactivated successfully.
Dec  1 04:13:24 np0005540741 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec  1 04:13:25 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:25 np0005540741 ceph-mon[75031]: Added host compute-0
Dec  1 04:13:25 np0005540741 systemd[1]: var-lib-containers-storage-overlay-d16d0e4ee2e4e24f339ac8a406f324b8b3a879513f4537f785e8cbf09f7c7e9b-merged.mount: Deactivated successfully.
Dec  1 04:13:25 np0005540741 podman[77424]: 2025-12-01 09:13:25.460630228 +0000 UTC m=+1.002425198 container remove 6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45 (image=quay.io/ceph/ceph:v18, name=bold_pasteur, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Dec  1 04:13:25 np0005540741 systemd[1]: libpod-conmon-6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45.scope: Deactivated successfully.
Dec  1 04:13:25 np0005540741 podman[77364]: 2025-12-01 09:13:25.475152709 +0000 UTC m=+1.387412153 container died f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94 (image=quay.io/ceph/ceph:v18, name=priceless_herschel, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec  1 04:13:25 np0005540741 systemd[1]: var-lib-containers-storage-overlay-e9d5a3b5a8d5b9bb446cd4268cb5f458c2969d3fb49469135ba214f3d46b8043-merged.mount: Deactivated successfully.
Dec  1 04:13:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=container_image}] v 0) v1
Dec  1 04:13:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:25 np0005540741 podman[77472]: 2025-12-01 09:13:25.514679902 +0000 UTC m=+0.563275436 container remove f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94 (image=quay.io/ceph/ceph:v18, name=priceless_herschel, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec  1 04:13:25 np0005540741 systemd[1]: libpod-conmon-f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94.scope: Deactivated successfully.
Dec  1 04:13:25 np0005540741 podman[77493]: 2025-12-01 09:13:25.574861858 +0000 UTC m=+0.036354490 container create 33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8 (image=quay.io/ceph/ceph:v18, name=youthful_curran, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default)
Dec  1 04:13:25 np0005540741 systemd[1]: Started libpod-conmon-33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8.scope.
Dec  1 04:13:25 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/baf28985a89a43298a0e5fe91f7a6d3f044b1a33233dcfb039654d6560a25169/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/baf28985a89a43298a0e5fe91f7a6d3f044b1a33233dcfb039654d6560a25169/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/baf28985a89a43298a0e5fe91f7a6d3f044b1a33233dcfb039654d6560a25169/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:25 np0005540741 podman[77493]: 2025-12-01 09:13:25.651968937 +0000 UTC m=+0.113461579 container init 33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8 (image=quay.io/ceph/ceph:v18, name=youthful_curran, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec  1 04:13:25 np0005540741 podman[77493]: 2025-12-01 09:13:25.558927695 +0000 UTC m=+0.020420347 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:25 np0005540741 podman[77493]: 2025-12-01 09:13:25.658844711 +0000 UTC m=+0.120337363 container start 33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8 (image=quay.io/ceph/ceph:v18, name=youthful_curran, CEPH_REF=reef, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec  1 04:13:25 np0005540741 podman[77493]: 2025-12-01 09:13:25.66218896 +0000 UTC m=+0.123681602 container attach 33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8 (image=quay.io/ceph/ceph:v18, name=youthful_curran, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec  1 04:13:26 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:13:26 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:26 np0005540741 ceph-mon[75031]: Saving service mon spec with placement count:5
Dec  1 04:13:26 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:26 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:26 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:26 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14158 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:13:26 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Saving service mgr spec with placement count:2
Dec  1 04:13:26 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Saving service mgr spec with placement count:2
Dec  1 04:13:26 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) v1
Dec  1 04:13:26 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:26 np0005540741 youthful_curran[77553]: Scheduled mgr update...
Dec  1 04:13:26 np0005540741 systemd[1]: libpod-33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8.scope: Deactivated successfully.
Dec  1 04:13:26 np0005540741 podman[77493]: 2025-12-01 09:13:26.29602742 +0000 UTC m=+0.757520062 container died 33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8 (image=quay.io/ceph/ceph:v18, name=youthful_curran, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Dec  1 04:13:26 np0005540741 systemd[1]: var-lib-containers-storage-overlay-baf28985a89a43298a0e5fe91f7a6d3f044b1a33233dcfb039654d6560a25169-merged.mount: Deactivated successfully.
Dec  1 04:13:26 np0005540741 podman[77493]: 2025-12-01 09:13:26.336463669 +0000 UTC m=+0.797956301 container remove 33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8 (image=quay.io/ceph/ceph:v18, name=youthful_curran, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:13:26 np0005540741 systemd[1]: libpod-conmon-33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8.scope: Deactivated successfully.
Dec  1 04:13:26 np0005540741 podman[77771]: 2025-12-01 09:13:26.400457589 +0000 UTC m=+0.043945836 container create c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5 (image=quay.io/ceph/ceph:v18, name=sweet_chebyshev, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Dec  1 04:13:26 np0005540741 systemd[1]: Started libpod-conmon-c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5.scope.
Dec  1 04:13:26 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:26 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f0a310e2443170f6fde92a8407bbdd079f9ac5a5daf5ab09e71ec63a5e04f2c/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:26 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f0a310e2443170f6fde92a8407bbdd079f9ac5a5daf5ab09e71ec63a5e04f2c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:26 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f0a310e2443170f6fde92a8407bbdd079f9ac5a5daf5ab09e71ec63a5e04f2c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:26 np0005540741 podman[77771]: 2025-12-01 09:13:26.474839456 +0000 UTC m=+0.118327743 container init c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5 (image=quay.io/ceph/ceph:v18, name=sweet_chebyshev, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:13:26 np0005540741 podman[77771]: 2025-12-01 09:13:26.382872287 +0000 UTC m=+0.026360554 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:26 np0005540741 podman[77771]: 2025-12-01 09:13:26.482352119 +0000 UTC m=+0.125840366 container start c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5 (image=quay.io/ceph/ceph:v18, name=sweet_chebyshev, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2)
Dec  1 04:13:26 np0005540741 podman[77771]: 2025-12-01 09:13:26.48575276 +0000 UTC m=+0.129241047 container attach c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5 (image=quay.io/ceph/ceph:v18, name=sweet_chebyshev, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:13:26 np0005540741 podman[77864]: 2025-12-01 09:13:26.853306187 +0000 UTC m=+0.161400780 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:13:26 np0005540741 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec  1 04:13:27 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14160 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:13:27 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Saving service crash spec with placement *
Dec  1 04:13:27 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Saving service crash spec with placement *
Dec  1 04:13:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.crash}] v 0) v1
Dec  1 04:13:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:27 np0005540741 sweet_chebyshev[77812]: Scheduled crash update...
Dec  1 04:13:27 np0005540741 systemd[1]: libpod-c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5.scope: Deactivated successfully.
Dec  1 04:13:27 np0005540741 podman[77771]: 2025-12-01 09:13:27.122110234 +0000 UTC m=+0.765598501 container died c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5 (image=quay.io/ceph/ceph:v18, name=sweet_chebyshev, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:13:27 np0005540741 systemd[1]: var-lib-containers-storage-overlay-7f0a310e2443170f6fde92a8407bbdd079f9ac5a5daf5ab09e71ec63a5e04f2c-merged.mount: Deactivated successfully.
Dec  1 04:13:27 np0005540741 podman[77771]: 2025-12-01 09:13:27.169055047 +0000 UTC m=+0.812543294 container remove c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5 (image=quay.io/ceph/ceph:v18, name=sweet_chebyshev, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec  1 04:13:27 np0005540741 systemd[1]: libpod-conmon-c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5.scope: Deactivated successfully.
Dec  1 04:13:27 np0005540741 podman[77864]: 2025-12-01 09:13:27.200687445 +0000 UTC m=+0.508782028 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  1 04:13:27 np0005540741 podman[77919]: 2025-12-01 09:13:27.246976079 +0000 UTC m=+0.057209809 container create 75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd (image=quay.io/ceph/ceph:v18, name=elastic_feynman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 04:13:27 np0005540741 ceph-mon[75031]: Saving service mgr spec with placement count:2
Dec  1 04:13:27 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:27 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:27 np0005540741 systemd[1]: Started libpod-conmon-75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd.scope.
Dec  1 04:13:27 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:27 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1fa88f76e7c2e8d9e95454c4b4b0a75770a75a14c6c6ef50c8ce3abc6e4a0d5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:27 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1fa88f76e7c2e8d9e95454c4b4b0a75770a75a14c6c6ef50c8ce3abc6e4a0d5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:27 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1fa88f76e7c2e8d9e95454c4b4b0a75770a75a14c6c6ef50c8ce3abc6e4a0d5/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:27 np0005540741 podman[77919]: 2025-12-01 09:13:27.223378258 +0000 UTC m=+0.033612008 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:27 np0005540741 podman[77919]: 2025-12-01 09:13:27.319717137 +0000 UTC m=+0.129950887 container init 75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd (image=quay.io/ceph/ceph:v18, name=elastic_feynman, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec  1 04:13:27 np0005540741 podman[77919]: 2025-12-01 09:13:27.327670733 +0000 UTC m=+0.137904463 container start 75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd (image=quay.io/ceph/ceph:v18, name=elastic_feynman, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:13:27 np0005540741 podman[77919]: 2025-12-01 09:13:27.331079774 +0000 UTC m=+0.141313504 container attach 75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd (image=quay.io/ceph/ceph:v18, name=elastic_feynman, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec  1 04:13:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:13:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:27 np0005540741 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 78098 (sysctl)
Dec  1 04:13:27 np0005540741 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec  1 04:13:27 np0005540741 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec  1 04:13:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mgr/cephadm/container_init}] v 0) v1
Dec  1 04:13:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2604732978' entity='client.admin' 
Dec  1 04:13:27 np0005540741 systemd[1]: libpod-75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd.scope: Deactivated successfully.
Dec  1 04:13:27 np0005540741 podman[77919]: 2025-12-01 09:13:27.888816196 +0000 UTC m=+0.699049926 container died 75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd (image=quay.io/ceph/ceph:v18, name=elastic_feynman, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec  1 04:13:27 np0005540741 systemd[1]: var-lib-containers-storage-overlay-a1fa88f76e7c2e8d9e95454c4b4b0a75770a75a14c6c6ef50c8ce3abc6e4a0d5-merged.mount: Deactivated successfully.
Dec  1 04:13:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e2 _set_new_cache_sizes cache_size:1020054710 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:13:27 np0005540741 podman[77919]: 2025-12-01 09:13:27.934200773 +0000 UTC m=+0.744434503 container remove 75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd (image=quay.io/ceph/ceph:v18, name=elastic_feynman, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507)
Dec  1 04:13:27 np0005540741 systemd[1]: libpod-conmon-75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd.scope: Deactivated successfully.
Dec  1 04:13:27 np0005540741 podman[78123]: 2025-12-01 09:13:27.996407019 +0000 UTC m=+0.042648367 container create 4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e (image=quay.io/ceph/ceph:v18, name=serene_diffie, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Dec  1 04:13:28 np0005540741 systemd[1]: Started libpod-conmon-4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e.scope.
Dec  1 04:13:28 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:28 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb4441c3f005973e0ad79d1c4bb1e5aa518d2e4ee346631f4cbfde85eab690d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:28 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb4441c3f005973e0ad79d1c4bb1e5aa518d2e4ee346631f4cbfde85eab690d/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:28 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb4441c3f005973e0ad79d1c4bb1e5aa518d2e4ee346631f4cbfde85eab690d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:28 np0005540741 podman[78123]: 2025-12-01 09:13:27.977770266 +0000 UTC m=+0.024011634 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:28 np0005540741 podman[78123]: 2025-12-01 09:13:28.077694621 +0000 UTC m=+0.123935989 container init 4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e (image=quay.io/ceph/ceph:v18, name=serene_diffie, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Dec  1 04:13:28 np0005540741 podman[78123]: 2025-12-01 09:13:28.083542184 +0000 UTC m=+0.129783532 container start 4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e (image=quay.io/ceph/ceph:v18, name=serene_diffie, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default)
Dec  1 04:13:28 np0005540741 podman[78123]: 2025-12-01 09:13:28.086629946 +0000 UTC m=+0.132871324 container attach 4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e (image=quay.io/ceph/ceph:v18, name=serene_diffie, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:13:28 np0005540741 ceph-mon[75031]: Saving service crash spec with placement *
Dec  1 04:13:28 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:28 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/2604732978' entity='client.admin' 
Dec  1 04:13:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:13:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:28 np0005540741 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec  1 04:13:29 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14164 -' entity='client.admin' cmd=[{"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "label:_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:13:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/client_keyrings}] v 0) v1
Dec  1 04:13:29 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:29 np0005540741 systemd[1]: libpod-4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e.scope: Deactivated successfully.
Dec  1 04:13:29 np0005540741 podman[78123]: 2025-12-01 09:13:29.030739703 +0000 UTC m=+1.076981081 container died 4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e (image=quay.io/ceph/ceph:v18, name=serene_diffie, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Dec  1 04:13:29 np0005540741 systemd[1]: var-lib-containers-storage-overlay-dbb4441c3f005973e0ad79d1c4bb1e5aa518d2e4ee346631f4cbfde85eab690d-merged.mount: Deactivated successfully.
Dec  1 04:13:29 np0005540741 systemd[1]: libpod-conmon-4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e.scope: Deactivated successfully.
Dec  1 04:13:29 np0005540741 podman[78123]: 2025-12-01 09:13:29.065886446 +0000 UTC m=+1.112127794 container remove 4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e (image=quay.io/ceph/ceph:v18, name=serene_diffie, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:13:29 np0005540741 podman[78407]: 2025-12-01 09:13:29.136130171 +0000 UTC m=+0.042494312 container create dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d (image=quay.io/ceph/ceph:v18, name=recursing_zhukovsky, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:13:29 np0005540741 systemd[1]: Started libpod-conmon-dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d.scope.
Dec  1 04:13:29 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:29 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f18712edbbe4e21ac9262f112543cee5b176447de534cf232d886e12829dd089/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:29 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f18712edbbe4e21ac9262f112543cee5b176447de534cf232d886e12829dd089/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:29 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f18712edbbe4e21ac9262f112543cee5b176447de534cf232d886e12829dd089/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:29 np0005540741 podman[78407]: 2025-12-01 09:13:29.211562199 +0000 UTC m=+0.117926360 container init dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d (image=quay.io/ceph/ceph:v18, name=recursing_zhukovsky, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2)
Dec  1 04:13:29 np0005540741 podman[78407]: 2025-12-01 09:13:29.117331383 +0000 UTC m=+0.023695544 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:29 np0005540741 podman[78407]: 2025-12-01 09:13:29.217376982 +0000 UTC m=+0.123741123 container start dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d (image=quay.io/ceph/ceph:v18, name=recursing_zhukovsky, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:13:29 np0005540741 podman[78407]: 2025-12-01 09:13:29.22034118 +0000 UTC m=+0.126705321 container attach dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d (image=quay.io/ceph/ceph:v18, name=recursing_zhukovsky, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec  1 04:13:29 np0005540741 podman[78467]: 2025-12-01 09:13:29.379431681 +0000 UTC m=+0.083098877 container create ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_rubin, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Dec  1 04:13:29 np0005540741 podman[78467]: 2025-12-01 09:13:29.316345379 +0000 UTC m=+0.020012595 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:13:29 np0005540741 systemd[1]: Started libpod-conmon-ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202.scope.
Dec  1 04:13:29 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:29 np0005540741 podman[78467]: 2025-12-01 09:13:29.481874411 +0000 UTC m=+0.185541647 container init ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_rubin, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec  1 04:13:29 np0005540741 podman[78467]: 2025-12-01 09:13:29.488362963 +0000 UTC m=+0.192030159 container start ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_rubin, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec  1 04:13:29 np0005540741 podman[78467]: 2025-12-01 09:13:29.491673122 +0000 UTC m=+0.195340438 container attach ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_rubin, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:13:29 np0005540741 busy_rubin[78483]: 167 167
Dec  1 04:13:29 np0005540741 systemd[1]: libpod-ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202.scope: Deactivated successfully.
Dec  1 04:13:29 np0005540741 podman[78467]: 2025-12-01 09:13:29.49633973 +0000 UTC m=+0.200006916 container died ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_rubin, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec  1 04:13:29 np0005540741 systemd[1]: var-lib-containers-storage-overlay-b2a129378f30c0619c656d87793c2ebf33008bb8fbeec51482add953933b6d3d-merged.mount: Deactivated successfully.
Dec  1 04:13:29 np0005540741 podman[78467]: 2025-12-01 09:13:29.535782101 +0000 UTC m=+0.239449297 container remove ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_rubin, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Dec  1 04:13:29 np0005540741 systemd[1]: libpod-conmon-ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202.scope: Deactivated successfully.
Dec  1 04:13:29 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:29 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:29 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14166 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "compute-0", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:13:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) v1
Dec  1 04:13:29 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:29 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Added label _admin to host compute-0
Dec  1 04:13:29 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Added label _admin to host compute-0
Dec  1 04:13:29 np0005540741 recursing_zhukovsky[78441]: Added label _admin to host compute-0
Dec  1 04:13:29 np0005540741 systemd[1]: libpod-dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d.scope: Deactivated successfully.
Dec  1 04:13:29 np0005540741 podman[78407]: 2025-12-01 09:13:29.843160612 +0000 UTC m=+0.749524753 container died dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d (image=quay.io/ceph/ceph:v18, name=recursing_zhukovsky, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:13:29 np0005540741 systemd[1]: var-lib-containers-storage-overlay-f18712edbbe4e21ac9262f112543cee5b176447de534cf232d886e12829dd089-merged.mount: Deactivated successfully.
Dec  1 04:13:29 np0005540741 podman[78407]: 2025-12-01 09:13:29.938485591 +0000 UTC m=+0.844849732 container remove dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d (image=quay.io/ceph/ceph:v18, name=recursing_zhukovsky, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 04:13:29 np0005540741 systemd[1]: libpod-conmon-dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d.scope: Deactivated successfully.
Dec  1 04:13:30 np0005540741 podman[78534]: 2025-12-01 09:13:30.069409697 +0000 UTC m=+0.111780059 container create 08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64 (image=quay.io/ceph/ceph:v18, name=xenodochial_brattain, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec  1 04:13:30 np0005540741 podman[78534]: 2025-12-01 09:13:29.979684544 +0000 UTC m=+0.022054926 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:30 np0005540741 systemd[1]: Started libpod-conmon-08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64.scope.
Dec  1 04:13:30 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3770ec82946ea8cac3e6025ee0a2994907d5bd66be6f52c2009a25743afee56/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3770ec82946ea8cac3e6025ee0a2994907d5bd66be6f52c2009a25743afee56/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3770ec82946ea8cac3e6025ee0a2994907d5bd66be6f52c2009a25743afee56/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:30 np0005540741 podman[78534]: 2025-12-01 09:13:30.138201508 +0000 UTC m=+0.180571890 container init 08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64 (image=quay.io/ceph/ceph:v18, name=xenodochial_brattain, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:13:30 np0005540741 podman[78534]: 2025-12-01 09:13:30.144382241 +0000 UTC m=+0.186752603 container start 08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64 (image=quay.io/ceph/ceph:v18, name=xenodochial_brattain, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec  1 04:13:30 np0005540741 podman[78534]: 2025-12-01 09:13:30.148066521 +0000 UTC m=+0.190436913 container attach 08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64 (image=quay.io/ceph/ceph:v18, name=xenodochial_brattain, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec  1 04:13:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=osd_memory_target_autotune}] v 0) v1
Dec  1 04:13:30 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/729133558' entity='client.admin' 
Dec  1 04:13:30 np0005540741 systemd[1]: libpod-08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64.scope: Deactivated successfully.
Dec  1 04:13:30 np0005540741 podman[78534]: 2025-12-01 09:13:30.750662333 +0000 UTC m=+0.793032695 container died 08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64 (image=quay.io/ceph/ceph:v18, name=xenodochial_brattain, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:13:30 np0005540741 systemd[1]: var-lib-containers-storage-overlay-c3770ec82946ea8cac3e6025ee0a2994907d5bd66be6f52c2009a25743afee56-merged.mount: Deactivated successfully.
Dec  1 04:13:30 np0005540741 podman[78534]: 2025-12-01 09:13:30.790507555 +0000 UTC m=+0.832877917 container remove 08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64 (image=quay.io/ceph/ceph:v18, name=xenodochial_brattain, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec  1 04:13:30 np0005540741 systemd[1]: libpod-conmon-08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64.scope: Deactivated successfully.
Dec  1 04:13:30 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:30 np0005540741 ceph-mon[75031]: Added label _admin to host compute-0
Dec  1 04:13:30 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/729133558' entity='client.admin' 
Dec  1 04:13:30 np0005540741 podman[78590]: 2025-12-01 09:13:30.866482649 +0000 UTC m=+0.057564669 container create a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c (image=quay.io/ceph/ceph:v18, name=stupefied_feistel, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Dec  1 04:13:30 np0005540741 systemd[1]: Started libpod-conmon-a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c.scope.
Dec  1 04:13:30 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b761eeb2c6a6db13713e4031d484c943271247c27b8d78e5eea9a9389140da44/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b761eeb2c6a6db13713e4031d484c943271247c27b8d78e5eea9a9389140da44/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b761eeb2c6a6db13713e4031d484c943271247c27b8d78e5eea9a9389140da44/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:30 np0005540741 podman[78590]: 2025-12-01 09:13:30.84124493 +0000 UTC m=+0.032327020 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:30 np0005540741 podman[78590]: 2025-12-01 09:13:30.939933729 +0000 UTC m=+0.131015719 container init a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c (image=quay.io/ceph/ceph:v18, name=stupefied_feistel, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:13:30 np0005540741 podman[78590]: 2025-12-01 09:13:30.945056121 +0000 UTC m=+0.136138111 container start a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c (image=quay.io/ceph/ceph:v18, name=stupefied_feistel, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  1 04:13:30 np0005540741 podman[78590]: 2025-12-01 09:13:30.948248616 +0000 UTC m=+0.139330626 container attach a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c (image=quay.io/ceph/ceph:v18, name=stupefied_feistel, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:13:30 np0005540741 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec  1 04:13:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/dashboard/cluster/status}] v 0) v1
Dec  1 04:13:31 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3188111268' entity='client.admin' 
Dec  1 04:13:31 np0005540741 stupefied_feistel[78607]: set mgr/dashboard/cluster/status
Dec  1 04:13:31 np0005540741 systemd[1]: libpod-a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c.scope: Deactivated successfully.
Dec  1 04:13:31 np0005540741 podman[78590]: 2025-12-01 09:13:31.664709276 +0000 UTC m=+0.855791276 container died a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c (image=quay.io/ceph/ceph:v18, name=stupefied_feistel, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:13:31 np0005540741 systemd[1]: var-lib-containers-storage-overlay-b761eeb2c6a6db13713e4031d484c943271247c27b8d78e5eea9a9389140da44-merged.mount: Deactivated successfully.
Dec  1 04:13:31 np0005540741 podman[78590]: 2025-12-01 09:13:31.706223418 +0000 UTC m=+0.897305408 container remove a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c (image=quay.io/ceph/ceph:v18, name=stupefied_feistel, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec  1 04:13:31 np0005540741 systemd[1]: libpod-conmon-a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c.scope: Deactivated successfully.
Dec  1 04:13:31 np0005540741 podman[78654]: 2025-12-01 09:13:31.91552979 +0000 UTC m=+0.044991036 container create 1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_jang, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:13:31 np0005540741 systemd[1]: Started libpod-conmon-1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199.scope.
Dec  1 04:13:31 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:31 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10d2f33fd1b0c2fe7fb20b41c94ea34e3bd0c096ee73f6aeadddfa08650a78c1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:31 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10d2f33fd1b0c2fe7fb20b41c94ea34e3bd0c096ee73f6aeadddfa08650a78c1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:31 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10d2f33fd1b0c2fe7fb20b41c94ea34e3bd0c096ee73f6aeadddfa08650a78c1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:31 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10d2f33fd1b0c2fe7fb20b41c94ea34e3bd0c096ee73f6aeadddfa08650a78c1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:31 np0005540741 podman[78654]: 2025-12-01 09:13:31.981012554 +0000 UTC m=+0.110473780 container init 1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_jang, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:13:31 np0005540741 podman[78654]: 2025-12-01 09:13:31.99066353 +0000 UTC m=+0.120124736 container start 1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_jang, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec  1 04:13:31 np0005540741 podman[78654]: 2025-12-01 09:13:31.897781834 +0000 UTC m=+0.027243060 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:13:31 np0005540741 podman[78654]: 2025-12-01 09:13:31.994317269 +0000 UTC m=+0.123778585 container attach 1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_jang, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:13:32 np0005540741 python3[78701]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   config set mgr mgr/cephadm/use_repo_digest false#012 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:13:32 np0005540741 podman[78702]: 2025-12-01 09:13:32.264530447 +0000 UTC m=+0.050630643 container create 99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea (image=quay.io/ceph/ceph:v18, name=optimistic_cray, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:13:32 np0005540741 systemd[1]: Started libpod-conmon-99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea.scope.
Dec  1 04:13:32 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:32 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6983d1ee366207ebadf544cff5285b0f5378145fefc093623813b904edb3e89/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:32 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6983d1ee366207ebadf544cff5285b0f5378145fefc093623813b904edb3e89/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:32 np0005540741 podman[78702]: 2025-12-01 09:13:32.332572376 +0000 UTC m=+0.118672562 container init 99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea (image=quay.io/ceph/ceph:v18, name=optimistic_cray, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:13:32 np0005540741 podman[78702]: 2025-12-01 09:13:32.245481742 +0000 UTC m=+0.031581948 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:32 np0005540741 podman[78702]: 2025-12-01 09:13:32.341364297 +0000 UTC m=+0.127464493 container start 99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea (image=quay.io/ceph/ceph:v18, name=optimistic_cray, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec  1 04:13:32 np0005540741 podman[78702]: 2025-12-01 09:13:32.344699716 +0000 UTC m=+0.130799912 container attach 99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea (image=quay.io/ceph/ceph:v18, name=optimistic_cray, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec  1 04:13:32 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/3188111268' entity='client.admin' 
Dec  1 04:13:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mgr/cephadm/use_repo_digest}] v 0) v1
Dec  1 04:13:32 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1972745556' entity='client.admin' 
Dec  1 04:13:32 np0005540741 systemd[1]: libpod-99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea.scope: Deactivated successfully.
Dec  1 04:13:32 np0005540741 podman[78702]: 2025-12-01 09:13:32.911040683 +0000 UTC m=+0.697140879 container died 99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea (image=quay.io/ceph/ceph:v18, name=optimistic_cray, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3)
Dec  1 04:13:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e2 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:13:32 np0005540741 systemd[1]: var-lib-containers-storage-overlay-d6983d1ee366207ebadf544cff5285b0f5378145fefc093623813b904edb3e89-merged.mount: Deactivated successfully.
Dec  1 04:13:32 np0005540741 ceph-mgr[75324]: mgr.server send_report Giving up on OSDs that haven't reported yet, sending potentially incomplete PG state to mon
Dec  1 04:13:32 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:13:32 np0005540741 ceph-mon[75031]: log_channel(cluster) log [WRN] : Health check failed: OSD count 0 < osd_pool_default_size 1 (TOO_FEW_OSDS)
Dec  1 04:13:32 np0005540741 podman[78702]: 2025-12-01 09:13:32.956138861 +0000 UTC m=+0.742239057 container remove 99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea (image=quay.io/ceph/ceph:v18, name=optimistic_cray, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507)
Dec  1 04:13:32 np0005540741 systemd[1]: libpod-conmon-99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea.scope: Deactivated successfully.
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]: [
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:    {
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:        "available": false,
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:        "ceph_device": false,
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:        "device_id": "QEMU_DVD-ROM_QM00001",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:        "lsm_data": {},
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:        "lvs": [],
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:        "path": "/dev/sr0",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:        "rejected_reasons": [
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "Has a FileSystem",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "Insufficient space (<5GB)"
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:        ],
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:        "sys_api": {
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "actuators": null,
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "device_nodes": "sr0",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "devname": "sr0",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "human_readable_size": "482.00 KB",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "id_bus": "ata",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "model": "QEMU DVD-ROM",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "nr_requests": "2",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "parent": "/dev/sr0",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "partitions": {},
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "path": "/dev/sr0",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "removable": "1",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "rev": "2.5+",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "ro": "0",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "rotational": "1",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "sas_address": "",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "sas_device_handle": "",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "scheduler_mode": "mq-deadline",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "sectors": 0,
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "sectorsize": "2048",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "size": 493568.0,
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "support_discard": "2048",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "type": "disk",
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:            "vendor": "QEMU"
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:        }
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]:    }
Dec  1 04:13:33 np0005540741 beautiful_jang[78671]: ]
Dec  1 04:13:33 np0005540741 systemd[1]: libpod-1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199.scope: Deactivated successfully.
Dec  1 04:13:33 np0005540741 podman[78654]: 2025-12-01 09:13:33.345518536 +0000 UTC m=+1.474979772 container died 1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_jang, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:13:33 np0005540741 systemd[1]: libpod-1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199.scope: Consumed 1.382s CPU time.
Dec  1 04:13:33 np0005540741 systemd[1]: var-lib-containers-storage-overlay-10d2f33fd1b0c2fe7fb20b41c94ea34e3bd0c096ee73f6aeadddfa08650a78c1-merged.mount: Deactivated successfully.
Dec  1 04:13:33 np0005540741 podman[78654]: 2025-12-01 09:13:33.400484967 +0000 UTC m=+1.529946183 container remove 1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_jang, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:13:33 np0005540741 systemd[1]: libpod-conmon-1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199.scope: Deactivated successfully.
Dec  1 04:13:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:13:33 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:13:33 np0005540741 ansible-async_wrapper.py[80458]: Invoked with j466498949925 30 /home/zuul/.ansible/tmp/ansible-tmp-1764580413.3065505-36368-230320539451562/AnsiballZ_command.py _
Dec  1 04:13:33 np0005540741 ansible-async_wrapper.py[80461]: Starting module and watcher
Dec  1 04:13:33 np0005540741 ansible-async_wrapper.py[80461]: Start watching 80462 (30)
Dec  1 04:13:33 np0005540741 ansible-async_wrapper.py[80462]: Start module (80462)
Dec  1 04:13:33 np0005540741 ansible-async_wrapper.py[80458]: Return async_wrapper task started.
Dec  1 04:13:33 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:13:34 np0005540741 python3[80463]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch status --format json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:13:34 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/1972745556' entity='client.admin' 
Dec  1 04:13:34 np0005540741 ceph-mon[75031]: Health check failed: OSD count 0 < osd_pool_default_size 1 (TOO_FEW_OSDS)
Dec  1 04:13:34 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:34 np0005540741 podman[80464]: 2025-12-01 09:13:34.217122572 +0000 UTC m=+0.106162222 container create 431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7 (image=quay.io/ceph/ceph:v18, name=friendly_swirles, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True)
Dec  1 04:13:34 np0005540741 podman[80464]: 2025-12-01 09:13:34.140827237 +0000 UTC m=+0.029866897 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:13:34 np0005540741 systemd[1]: Started libpod-conmon-431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7.scope.
Dec  1 04:13:34 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:34 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca6df8fd25eab84d8dbfff5186db2a61eb7ccbfb0ca0820342e9a0a09a305aa2/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:34 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca6df8fd25eab84d8dbfff5186db2a61eb7ccbfb0ca0820342e9a0a09a305aa2/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} v 0) v1
Dec  1 04:13:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  1 04:13:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:13:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:13:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:13:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:13:34 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Updating compute-0:/etc/ceph/ceph.conf
Dec  1 04:13:34 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Updating compute-0:/etc/ceph/ceph.conf
Dec  1 04:13:34 np0005540741 podman[80464]: 2025-12-01 09:13:34.719911421 +0000 UTC m=+0.608951091 container init 431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7 (image=quay.io/ceph/ceph:v18, name=friendly_swirles, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:13:34 np0005540741 podman[80464]: 2025-12-01 09:13:34.73133859 +0000 UTC m=+0.620378240 container start 431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7 (image=quay.io/ceph/ceph:v18, name=friendly_swirles, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:13:34 np0005540741 podman[80464]: 2025-12-01 09:13:34.735078111 +0000 UTC m=+0.624117811 container attach 431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7 (image=quay.io/ceph/ceph:v18, name=friendly_swirles, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:13:34 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:13:35 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:35 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:35 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:35 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  1 04:13:35 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:13:35 np0005540741 ceph-mon[75031]: Updating compute-0:/etc/ceph/ceph.conf
Dec  1 04:13:35 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14174 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec  1 04:13:35 np0005540741 friendly_swirles[80479]: 
Dec  1 04:13:35 np0005540741 friendly_swirles[80479]: {"available": true, "backend": "cephadm", "paused": false, "workers": 10}
Dec  1 04:13:35 np0005540741 systemd[1]: libpod-431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7.scope: Deactivated successfully.
Dec  1 04:13:35 np0005540741 podman[80464]: 2025-12-01 09:13:35.306509589 +0000 UTC m=+1.195549239 container died 431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7 (image=quay.io/ceph/ceph:v18, name=friendly_swirles, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Dec  1 04:13:35 np0005540741 systemd[1]: var-lib-containers-storage-overlay-ca6df8fd25eab84d8dbfff5186db2a61eb7ccbfb0ca0820342e9a0a09a305aa2-merged.mount: Deactivated successfully.
Dec  1 04:13:35 np0005540741 podman[80464]: 2025-12-01 09:13:35.348171615 +0000 UTC m=+1.237211265 container remove 431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7 (image=quay.io/ceph/ceph:v18, name=friendly_swirles, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default)
Dec  1 04:13:35 np0005540741 systemd[1]: libpod-conmon-431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7.scope: Deactivated successfully.
Dec  1 04:13:35 np0005540741 ansible-async_wrapper.py[80462]: Module complete (80462)
Dec  1 04:13:35 np0005540741 python3[80794]: ansible-ansible.legacy.async_status Invoked with jid=j466498949925.80458 mode=status _async_dir=/root/.ansible_async
Dec  1 04:13:35 np0005540741 python3[80983]: ansible-ansible.legacy.async_status Invoked with jid=j466498949925.80458 mode=cleanup _async_dir=/root/.ansible_async
Dec  1 04:13:35 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Updating compute-0:/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.conf
Dec  1 04:13:35 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Updating compute-0:/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.conf
Dec  1 04:13:36 np0005540741 python3[81185]: ansible-ansible.builtin.stat Invoked with path=/home/ceph-admin/specs/ceph_spec.yaml follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  1 04:13:36 np0005540741 ceph-mon[75031]: Updating compute-0:/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.conf
Dec  1 04:13:36 np0005540741 python3[81410]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch status --format json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:13:36 np0005540741 podman[81455]: 2025-12-01 09:13:36.750558682 +0000 UTC m=+0.081219681 container create b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e (image=quay.io/ceph/ceph:v18, name=recursing_albattani, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec  1 04:13:36 np0005540741 podman[81455]: 2025-12-01 09:13:36.696881309 +0000 UTC m=+0.027542328 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:36 np0005540741 systemd[1]: Started libpod-conmon-b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e.scope.
Dec  1 04:13:36 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:36 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29c84533923f6c1f5f9c8b792aced0a8831f0c78cc76d6133f2fd02c9d28aa8a/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:36 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29c84533923f6c1f5f9c8b792aced0a8831f0c78cc76d6133f2fd02c9d28aa8a/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:36 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29c84533923f6c1f5f9c8b792aced0a8831f0c78cc76d6133f2fd02c9d28aa8a/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:36 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:13:36 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:13:36 np0005540741 podman[81455]: 2025-12-01 09:13:36.887437554 +0000 UTC m=+0.218098573 container init b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e (image=quay.io/ceph/ceph:v18, name=recursing_albattani, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:13:36 np0005540741 podman[81455]: 2025-12-01 09:13:36.893675869 +0000 UTC m=+0.224336868 container start b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e (image=quay.io/ceph/ceph:v18, name=recursing_albattani, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec  1 04:13:36 np0005540741 podman[81455]: 2025-12-01 09:13:36.900634626 +0000 UTC m=+0.231295625 container attach b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e (image=quay.io/ceph/ceph:v18, name=recursing_albattani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec  1 04:13:36 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:13:37 np0005540741 ceph-mon[75031]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:13:37 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14176 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec  1 04:13:37 np0005540741 recursing_albattani[81527]: 
Dec  1 04:13:37 np0005540741 recursing_albattani[81527]: {"available": true, "backend": "cephadm", "paused": false, "workers": 10}
Dec  1 04:13:37 np0005540741 systemd[1]: libpod-b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e.scope: Deactivated successfully.
Dec  1 04:13:37 np0005540741 podman[81455]: 2025-12-01 09:13:37.512111192 +0000 UTC m=+0.842772191 container died b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e (image=quay.io/ceph/ceph:v18, name=recursing_albattani, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec  1 04:13:37 np0005540741 systemd[1]: var-lib-containers-storage-overlay-29c84533923f6c1f5f9c8b792aced0a8831f0c78cc76d6133f2fd02c9d28aa8a-merged.mount: Deactivated successfully.
Dec  1 04:13:37 np0005540741 podman[81455]: 2025-12-01 09:13:37.617830059 +0000 UTC m=+0.948491058 container remove b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e (image=quay.io/ceph/ceph:v18, name=recursing_albattani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:13:37 np0005540741 systemd[1]: libpod-conmon-b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e.scope: Deactivated successfully.
Dec  1 04:13:37 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e2 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:13:37 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Updating compute-0:/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.client.admin.keyring
Dec  1 04:13:37 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Updating compute-0:/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.client.admin.keyring
Dec  1 04:13:38 np0005540741 python3[82005]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   config set global log_to_file true _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:13:38 np0005540741 podman[82055]: 2025-12-01 09:13:38.107100968 +0000 UTC m=+0.040876014 container create dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3 (image=quay.io/ceph/ceph:v18, name=upbeat_kepler, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:13:38 np0005540741 systemd[1]: Started libpod-conmon-dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3.scope.
Dec  1 04:13:38 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb02d7ccbdb527d7a6b098b61835ba4e3f9adfa4f9f7758f30a636b30e3db9f9/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb02d7ccbdb527d7a6b098b61835ba4e3f9adfa4f9f7758f30a636b30e3db9f9/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb02d7ccbdb527d7a6b098b61835ba4e3f9adfa4f9f7758f30a636b30e3db9f9/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:38 np0005540741 podman[82055]: 2025-12-01 09:13:38.089970659 +0000 UTC m=+0.023745725 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:38 np0005540741 podman[82055]: 2025-12-01 09:13:38.212592108 +0000 UTC m=+0.146367174 container init dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3 (image=quay.io/ceph/ceph:v18, name=upbeat_kepler, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:13:38 np0005540741 podman[82055]: 2025-12-01 09:13:38.222182613 +0000 UTC m=+0.155957659 container start dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3 (image=quay.io/ceph/ceph:v18, name=upbeat_kepler, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3)
Dec  1 04:13:38 np0005540741 podman[82055]: 2025-12-01 09:13:38.264857319 +0000 UTC m=+0.198632385 container attach dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3 (image=quay.io/ceph/ceph:v18, name=upbeat_kepler, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:13:38 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=log_to_file}] v 0) v1
Dec  1 04:13:38 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2416352962' entity='client.admin' 
Dec  1 04:13:38 np0005540741 systemd[1]: libpod-dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3.scope: Deactivated successfully.
Dec  1 04:13:38 np0005540741 podman[82055]: 2025-12-01 09:13:38.836939746 +0000 UTC m=+0.770714792 container died dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3 (image=quay.io/ceph/ceph:v18, name=upbeat_kepler, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:13:38 np0005540741 systemd[1]: var-lib-containers-storage-overlay-cb02d7ccbdb527d7a6b098b61835ba4e3f9adfa4f9f7758f30a636b30e3db9f9-merged.mount: Deactivated successfully.
Dec  1 04:13:38 np0005540741 podman[82055]: 2025-12-01 09:13:38.893113063 +0000 UTC m=+0.826888109 container remove dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3 (image=quay.io/ceph/ceph:v18, name=upbeat_kepler, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:13:38 np0005540741 systemd[1]: libpod-conmon-dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3.scope: Deactivated successfully.
Dec  1 04:13:38 np0005540741 ansible-async_wrapper.py[80461]: Done in kid B.
Dec  1 04:13:38 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:39 np0005540741 ceph-mgr[75324]: [progress INFO root] update: starting ev c5abd3f8-9653-4f8e-81e4-4aa8afc043ca (Updating crash deployment (+1 -> 1))
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) v1
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:13:39 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Deploying daemon crash.compute-0 on compute-0
Dec  1 04:13:39 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Deploying daemon crash.compute-0 on compute-0
Dec  1 04:13:39 np0005540741 python3[82498]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   config set global mon_cluster_log_to_file true _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:13:39 np0005540741 podman[82548]: 2025-12-01 09:13:39.276121239 +0000 UTC m=+0.051305634 container create 58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189 (image=quay.io/ceph/ceph:v18, name=sleepy_liskov, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Dec  1 04:13:39 np0005540741 systemd[1]: Started libpod-conmon-58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189.scope.
Dec  1 04:13:39 np0005540741 podman[82548]: 2025-12-01 09:13:39.251714505 +0000 UTC m=+0.026898900 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:39 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:39 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e53a8e5acef89a6971d5bb616e16281af2a9a02b114718d991e587846bcd9ea/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:39 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e53a8e5acef89a6971d5bb616e16281af2a9a02b114718d991e587846bcd9ea/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:39 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e53a8e5acef89a6971d5bb616e16281af2a9a02b114718d991e587846bcd9ea/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:39 np0005540741 podman[82548]: 2025-12-01 09:13:39.37017936 +0000 UTC m=+0.145363765 container init 58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189 (image=quay.io/ceph/ceph:v18, name=sleepy_liskov, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec  1 04:13:39 np0005540741 podman[82548]: 2025-12-01 09:13:39.381728103 +0000 UTC m=+0.156912488 container start 58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189 (image=quay.io/ceph/ceph:v18, name=sleepy_liskov, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec  1 04:13:39 np0005540741 podman[82548]: 2025-12-01 09:13:39.389514764 +0000 UTC m=+0.164699249 container attach 58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189 (image=quay.io/ceph/ceph:v18, name=sleepy_liskov, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Dec  1 04:13:39 np0005540741 podman[82691]: 2025-12-01 09:13:39.813782674 +0000 UTC m=+0.047879651 container create 863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_beaver, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: Updating compute-0:/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.client.admin.keyring
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/2416352962' entity='client.admin' 
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Dec  1 04:13:39 np0005540741 systemd[1]: Started libpod-conmon-863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed.scope.
Dec  1 04:13:39 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:39 np0005540741 podman[82691]: 2025-12-01 09:13:39.880810754 +0000 UTC m=+0.114907761 container init 863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_beaver, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:13:39 np0005540741 podman[82691]: 2025-12-01 09:13:39.885483892 +0000 UTC m=+0.119580869 container start 863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_beaver, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec  1 04:13:39 np0005540741 objective_beaver[82715]: 167 167
Dec  1 04:13:39 np0005540741 podman[82691]: 2025-12-01 09:13:39.794251915 +0000 UTC m=+0.028348912 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:13:39 np0005540741 systemd[1]: libpod-863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed.scope: Deactivated successfully.
Dec  1 04:13:39 np0005540741 podman[82691]: 2025-12-01 09:13:39.903403054 +0000 UTC m=+0.137500031 container attach 863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_beaver, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec  1 04:13:39 np0005540741 podman[82691]: 2025-12-01 09:13:39.903854937 +0000 UTC m=+0.137951914 container died 863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_beaver, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:13:39 np0005540741 systemd[1]: var-lib-containers-storage-overlay-f97932927680a395270ee6f38245217f3c971a29c5dea2216bcb4485ca36a553-merged.mount: Deactivated successfully.
Dec  1 04:13:39 np0005540741 podman[82691]: 2025-12-01 09:13:39.943601867 +0000 UTC m=+0.177698834 container remove 863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_beaver, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec  1 04:13:39 np0005540741 systemd[1]: libpod-conmon-863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed.scope: Deactivated successfully.
Dec  1 04:13:39 np0005540741 systemd[1]: Reloading.
Dec  1 04:13:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mon_cluster_log_to_file}] v 0) v1
Dec  1 04:13:40 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/592707641' entity='client.admin' 
Dec  1 04:13:40 np0005540741 podman[82548]: 2025-12-01 09:13:40.03872159 +0000 UTC m=+0.813905975 container died 58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189 (image=quay.io/ceph/ceph:v18, name=sleepy_liskov, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:13:40 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:13:40 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:13:40 np0005540741 systemd[1]: libpod-58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189.scope: Deactivated successfully.
Dec  1 04:13:40 np0005540741 systemd[1]: var-lib-containers-storage-overlay-7e53a8e5acef89a6971d5bb616e16281af2a9a02b114718d991e587846bcd9ea-merged.mount: Deactivated successfully.
Dec  1 04:13:40 np0005540741 podman[82548]: 2025-12-01 09:13:40.289646766 +0000 UTC m=+1.064831151 container remove 58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189 (image=quay.io/ceph/ceph:v18, name=sleepy_liskov, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:13:40 np0005540741 systemd[1]: libpod-conmon-58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189.scope: Deactivated successfully.
Dec  1 04:13:40 np0005540741 systemd[1]: Reloading.
Dec  1 04:13:40 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:13:40 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:13:40 np0005540741 systemd[1]: Starting Ceph crash.compute-0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec  1 04:13:40 np0005540741 python3[82852]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd set-require-min-compat-client mimic#012 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:13:40 np0005540741 podman[82878]: 2025-12-01 09:13:40.786096649 +0000 UTC m=+0.052315434 container create 3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50 (image=quay.io/ceph/ceph:v18, name=gallant_bassi, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec  1 04:13:40 np0005540741 systemd[1]: Started libpod-conmon-3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50.scope.
Dec  1 04:13:40 np0005540741 ceph-mon[75031]: Deploying daemon crash.compute-0 on compute-0
Dec  1 04:13:40 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/592707641' entity='client.admin' 
Dec  1 04:13:40 np0005540741 podman[82878]: 2025-12-01 09:13:40.763542899 +0000 UTC m=+0.029761704 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:40 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:40 np0005540741 podman[82915]: 2025-12-01 09:13:40.864702491 +0000 UTC m=+0.054522179 container create 83d60e6b432ce4cbd9a76d9ee4c24e49cbc1130ab3ceccbddaa8851b48170ec1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec  1 04:13:40 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c948808cf65ac18d9238ee6ad166ab88d63fb9149c76040ce59b26af013995e/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:40 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c948808cf65ac18d9238ee6ad166ab88d63fb9149c76040ce59b26af013995e/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:40 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c948808cf65ac18d9238ee6ad166ab88d63fb9149c76040ce59b26af013995e/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:40 np0005540741 podman[82878]: 2025-12-01 09:13:40.885643503 +0000 UTC m=+0.151862308 container init 3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50 (image=quay.io/ceph/ceph:v18, name=gallant_bassi, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec  1 04:13:40 np0005540741 podman[82878]: 2025-12-01 09:13:40.895018711 +0000 UTC m=+0.161237496 container start 3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50 (image=quay.io/ceph/ceph:v18, name=gallant_bassi, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:13:40 np0005540741 podman[82878]: 2025-12-01 09:13:40.899151054 +0000 UTC m=+0.165370129 container attach 3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50 (image=quay.io/ceph/ceph:v18, name=gallant_bassi, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  1 04:13:40 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac2cec797eb7bef878699279aa8ee94afbcebe6e9e58528317e8ac321c09be38/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:40 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac2cec797eb7bef878699279aa8ee94afbcebe6e9e58528317e8ac321c09be38/merged/etc/ceph/ceph.client.crash.compute-0.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:40 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac2cec797eb7bef878699279aa8ee94afbcebe6e9e58528317e8ac321c09be38/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:40 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac2cec797eb7bef878699279aa8ee94afbcebe6e9e58528317e8ac321c09be38/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:40 np0005540741 podman[82915]: 2025-12-01 09:13:40.928280258 +0000 UTC m=+0.118099966 container init 83d60e6b432ce4cbd9a76d9ee4c24e49cbc1130ab3ceccbddaa8851b48170ec1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:13:40 np0005540741 podman[82915]: 2025-12-01 09:13:40.838438822 +0000 UTC m=+0.028258560 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:13:40 np0005540741 podman[82915]: 2025-12-01 09:13:40.935115651 +0000 UTC m=+0.124935339 container start 83d60e6b432ce4cbd9a76d9ee4c24e49cbc1130ab3ceccbddaa8851b48170ec1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:13:40 np0005540741 bash[82915]: 83d60e6b432ce4cbd9a76d9ee4c24e49cbc1130ab3ceccbddaa8851b48170ec1
Dec  1 04:13:40 np0005540741 systemd[1]: Started Ceph crash.compute-0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec  1 04:13:40 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:13:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:13:40 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:41 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:13:41 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:41 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.crash}] v 0) v1
Dec  1 04:13:41 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:41 np0005540741 ceph-mgr[75324]: [progress INFO root] complete: finished ev c5abd3f8-9653-4f8e-81e4-4aa8afc043ca (Updating crash deployment (+1 -> 1))
Dec  1 04:13:41 np0005540741 ceph-mgr[75324]: [progress INFO root] Completed event c5abd3f8-9653-4f8e-81e4-4aa8afc043ca (Updating crash deployment (+1 -> 1)) in 2 seconds
Dec  1 04:13:41 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.crash}] v 0) v1
Dec  1 04:13:41 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:41 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev b641bac5-9918-45ef-846f-b436360b0fe4 does not exist
Dec  1 04:13:41 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) v1
Dec  1 04:13:41 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:41 np0005540741 ceph-mgr[75324]: [progress INFO root] update: starting ev 0ad1d74c-45e7-464b-841d-9ea23a988291 (Updating mgr deployment (+1 -> 2))
Dec  1 04:13:41 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.compute-0.htextg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) v1
Dec  1 04:13:41 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.htextg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  1 04:13:41 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.htextg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec  1 04:13:41 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Dec  1 04:13:41 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  1 04:13:41 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:13:41 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:13:41 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Deploying daemon mgr.compute-0.htextg on compute-0
Dec  1 04:13:41 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Deploying daemon mgr.compute-0.htextg on compute-0
Dec  1 04:13:41 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: INFO:ceph-crash:pinging cluster to exercise our key
Dec  1 04:13:41 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: 2025-12-01T09:13:41.359+0000 7f28ddf40640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec  1 04:13:41 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: 2025-12-01T09:13:41.359+0000 7f28ddf40640 -1 AuthRegistry(0x7f28d8067440) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec  1 04:13:41 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: 2025-12-01T09:13:41.360+0000 7f28ddf40640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec  1 04:13:41 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: 2025-12-01T09:13:41.360+0000 7f28ddf40640 -1 AuthRegistry(0x7f28ddf3f000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec  1 04:13:41 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: 2025-12-01T09:13:41.362+0000 7f28d77fe640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec  1 04:13:41 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: 2025-12-01T09:13:41.362+0000 7f28ddf40640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Dec  1 04:13:41 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: [errno 13] RADOS permission denied (error connecting to the cluster)
Dec  1 04:13:41 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Dec  1 04:13:41 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd set-require-min-compat-client", "version": "mimic"} v 0) v1
Dec  1 04:13:41 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3480666797' entity='client.admin' cmd=[{"prefix": "osd set-require-min-compat-client", "version": "mimic"}]: dispatch
Dec  1 04:13:41 np0005540741 podman[83113]: 2025-12-01 09:13:41.646531552 +0000 UTC m=+0.041915755 container create f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_joliot, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Dec  1 04:13:41 np0005540741 systemd[1]: Started libpod-conmon-f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c.scope.
Dec  1 04:13:41 np0005540741 podman[83113]: 2025-12-01 09:13:41.627419465 +0000 UTC m=+0.022803688 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:13:41 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:41 np0005540741 podman[83113]: 2025-12-01 09:13:41.749957181 +0000 UTC m=+0.145341404 container init f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_joliot, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:13:41 np0005540741 podman[83113]: 2025-12-01 09:13:41.75631872 +0000 UTC m=+0.151702923 container start f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_joliot, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:13:41 np0005540741 podman[83113]: 2025-12-01 09:13:41.75935261 +0000 UTC m=+0.154736833 container attach f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_joliot, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:13:41 np0005540741 elastic_joliot[83130]: 167 167
Dec  1 04:13:41 np0005540741 systemd[1]: libpod-f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c.scope: Deactivated successfully.
Dec  1 04:13:41 np0005540741 podman[83113]: 2025-12-01 09:13:41.763901515 +0000 UTC m=+0.159285738 container died f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_joliot, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  1 04:13:41 np0005540741 systemd[1]: var-lib-containers-storage-overlay-4959ada2c90cbf80f64e33b7ae0a84d1f46a8b734b633881957146213b6c9b8e-merged.mount: Deactivated successfully.
Dec  1 04:13:41 np0005540741 podman[83113]: 2025-12-01 09:13:41.838400385 +0000 UTC m=+0.233784588 container remove f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_joliot, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec  1 04:13:41 np0005540741 systemd[1]: libpod-conmon-f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c.scope: Deactivated successfully.
Dec  1 04:13:41 np0005540741 systemd[1]: Reloading.
Dec  1 04:13:41 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:13:41 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.htextg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.htextg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: Deploying daemon mgr.compute-0.htextg on compute-0
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/3480666797' entity='client.admin' cmd=[{"prefix": "osd set-require-min-compat-client", "version": "mimic"}]: dispatch
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e2 do_prune osdmap full prune enabled
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e2 encode_pending skipping prime_pg_temp; mapping job did not start
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3480666797' entity='client.admin' cmd='[{"prefix": "osd set-require-min-compat-client", "version": "mimic"}]': finished
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e3 e3: 0 total, 0 up, 0 in
Dec  1 04:13:42 np0005540741 gallant_bassi[82927]: set require_min_compat_client to mimic
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e3: 0 total, 0 up, 0 in
Dec  1 04:13:42 np0005540741 podman[82878]: 2025-12-01 09:13:42.065119844 +0000 UTC m=+1.331338649 container died 3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50 (image=quay.io/ceph/ceph:v18, name=gallant_bassi, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:13:42 np0005540741 systemd[1]: libpod-3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50.scope: Deactivated successfully.
Dec  1 04:13:42 np0005540741 systemd[1]: var-lib-containers-storage-overlay-6c948808cf65ac18d9238ee6ad166ab88d63fb9149c76040ce59b26af013995e-merged.mount: Deactivated successfully.
Dec  1 04:13:42 np0005540741 podman[82878]: 2025-12-01 09:13:42.197804071 +0000 UTC m=+1.464022856 container remove 3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50 (image=quay.io/ceph/ceph:v18, name=gallant_bassi, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec  1 04:13:42 np0005540741 systemd[1]: libpod-conmon-3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50.scope: Deactivated successfully.
Dec  1 04:13:42 np0005540741 systemd[1]: Reloading.
Dec  1 04:13:42 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:13:42 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:13:42 np0005540741 systemd[1]: Starting Ceph mgr.compute-0.htextg for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec  1 04:13:42 np0005540741 podman[83314]: 2025-12-01 09:13:42.744423922 +0000 UTC m=+0.051681724 container create 81acc04116fbf4755e71fd111eeaa509f9560ae5f0189ee1e5afafef5403a9ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-htextg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef)
Dec  1 04:13:42 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c2a8cd3227a37de55446263fb0fb96c95365808b997cd221ffa58d6e824aa46/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:42 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c2a8cd3227a37de55446263fb0fb96c95365808b997cd221ffa58d6e824aa46/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:42 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c2a8cd3227a37de55446263fb0fb96c95365808b997cd221ffa58d6e824aa46/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:42 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c2a8cd3227a37de55446263fb0fb96c95365808b997cd221ffa58d6e824aa46/merged/var/lib/ceph/mgr/ceph-compute-0.htextg supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:42 np0005540741 podman[83314]: 2025-12-01 09:13:42.814054349 +0000 UTC m=+0.121312171 container init 81acc04116fbf4755e71fd111eeaa509f9560ae5f0189ee1e5afafef5403a9ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-htextg, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:13:42 np0005540741 python3[83308]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch apply --in-file /home/ceph_spec.yaml _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:13:42 np0005540741 podman[83314]: 2025-12-01 09:13:42.820271063 +0000 UTC m=+0.127528865 container start 81acc04116fbf4755e71fd111eeaa509f9560ae5f0189ee1e5afafef5403a9ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-htextg, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:13:42 np0005540741 podman[83314]: 2025-12-01 09:13:42.72647413 +0000 UTC m=+0.033731962 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:13:42 np0005540741 bash[83314]: 81acc04116fbf4755e71fd111eeaa509f9560ae5f0189ee1e5afafef5403a9ce
Dec  1 04:13:42 np0005540741 systemd[1]: Started Ceph mgr.compute-0.htextg for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec  1 04:13:42 np0005540741 ceph-mgr[83335]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:13:42 np0005540741 ceph-mgr[83335]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Dec  1 04:13:42 np0005540741 ceph-mgr[83335]: pidfile_write: ignore empty --pid-file
Dec  1 04:13:42 np0005540741 podman[83334]: 2025-12-01 09:13:42.922430985 +0000 UTC m=+0.082055296 container create 0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f (image=quay.io/ceph/ceph:v18, name=recursing_euclid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e3 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:13:42 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:13:42 np0005540741 podman[83334]: 2025-12-01 09:13:42.871792992 +0000 UTC m=+0.031417333 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:13:42 np0005540741 systemd[1]: Started libpod-conmon-0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f.scope.
Dec  1 04:13:42 np0005540741 ceph-mgr[83335]: mgr[py] Loading python module 'alerts'
Dec  1 04:13:43 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:43 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:43 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/781a1fb8e63b0412ffeeb441cad9f2807ea13a9cc0ba03a7d3f142e2768a3e86/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:43 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/781a1fb8e63b0412ffeeb441cad9f2807ea13a9cc0ba03a7d3f142e2768a3e86/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:43 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/781a1fb8e63b0412ffeeb441cad9f2807ea13a9cc0ba03a7d3f142e2768a3e86/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:43 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) v1
Dec  1 04:13:43 np0005540741 ceph-mgr[75324]: [progress INFO root] Writing back 1 completed events
Dec  1 04:13:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:13:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:13:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:13:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:13:43 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) v1
Dec  1 04:13:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:13:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:13:43 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:43 np0005540741 podman[83334]: 2025-12-01 09:13:43.054918317 +0000 UTC m=+0.214542678 container init 0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f (image=quay.io/ceph/ceph:v18, name=recursing_euclid, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:13:43 np0005540741 ceph-mgr[75324]: [progress INFO root] complete: finished ev 0ad1d74c-45e7-464b-841d-9ea23a988291 (Updating mgr deployment (+1 -> 2))
Dec  1 04:13:43 np0005540741 ceph-mgr[75324]: [progress INFO root] Completed event 0ad1d74c-45e7-464b-841d-9ea23a988291 (Updating mgr deployment (+1 -> 2)) in 2 seconds
Dec  1 04:13:43 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) v1
Dec  1 04:13:43 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/3480666797' entity='client.admin' cmd='[{"prefix": "osd set-require-min-compat-client", "version": "mimic"}]': finished
Dec  1 04:13:43 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:43 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:43 np0005540741 podman[83334]: 2025-12-01 09:13:43.063961165 +0000 UTC m=+0.223585476 container start 0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f (image=quay.io/ceph/ceph:v18, name=recursing_euclid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec  1 04:13:43 np0005540741 podman[83334]: 2025-12-01 09:13:43.067857641 +0000 UTC m=+0.227481962 container attach 0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f (image=quay.io/ceph/ceph:v18, name=recursing_euclid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:13:43 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:43 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:43 np0005540741 ceph-mgr[83335]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:13:43 np0005540741 ceph-mgr[83335]: mgr[py] Loading python module 'balancer'
Dec  1 04:13:43 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-htextg[83330]: 2025-12-01T09:13:43.411+0000 7fc783261140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:13:43 np0005540741 ceph-mgr[83335]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:13:43 np0005540741 ceph-mgr[83335]: mgr[py] Loading python module 'cephadm'
Dec  1 04:13:43 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-htextg[83330]: 2025-12-01T09:13:43.729+0000 7fc783261140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:13:43 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14186 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:44 np0005540741 podman[83712]: 2025-12-01 09:13:44.162046511 +0000 UTC m=+0.059143406 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:13:44 np0005540741 podman[83712]: 2025-12-01 09:13:44.285082563 +0000 UTC m=+0.182179448 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) v1
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) v1
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) v1
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) v1
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:44 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Added host compute-0
Dec  1 04:13:44 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Added host compute-0
Dec  1 04:13:44 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Saving service mon spec with placement compute-0
Dec  1 04:13:44 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Saving service mon spec with placement compute-0
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) v1
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:44 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Saving service mgr spec with placement compute-0
Dec  1 04:13:44 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Saving service mgr spec with placement compute-0
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) v1
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:44 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Marking host: compute-0 for OSDSpec preview refresh.
Dec  1 04:13:44 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Marking host: compute-0 for OSDSpec preview refresh.
Dec  1 04:13:44 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Saving service osd.default_drive_group spec with placement compute-0
Dec  1 04:13:44 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Saving service osd.default_drive_group spec with placement compute-0
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.osd.default_drive_group}] v 0) v1
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:44 np0005540741 recursing_euclid[83375]: Added host 'compute-0' with addr '192.168.122.100'
Dec  1 04:13:44 np0005540741 recursing_euclid[83375]: Scheduled mon update...
Dec  1 04:13:44 np0005540741 recursing_euclid[83375]: Scheduled mgr update...
Dec  1 04:13:44 np0005540741 recursing_euclid[83375]: Scheduled osd.default_drive_group update...
Dec  1 04:13:44 np0005540741 systemd[1]: libpod-0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f.scope: Deactivated successfully.
Dec  1 04:13:44 np0005540741 podman[83334]: 2025-12-01 09:13:44.438617329 +0000 UTC m=+1.598241660 container died 0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f (image=quay.io/ceph/ceph:v18, name=recursing_euclid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec  1 04:13:44 np0005540741 systemd[1]: var-lib-containers-storage-overlay-781a1fb8e63b0412ffeeb441cad9f2807ea13a9cc0ba03a7d3f142e2768a3e86-merged.mount: Deactivated successfully.
Dec  1 04:13:44 np0005540741 podman[83334]: 2025-12-01 09:13:44.515110569 +0000 UTC m=+1.674734880 container remove 0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f (image=quay.io/ceph/ceph:v18, name=recursing_euclid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec  1 04:13:44 np0005540741 systemd[1]: libpod-conmon-0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f.scope: Deactivated successfully.
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:44 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 52a4df2c-eef2-44c2-ae44-71aaae30c143 does not exist
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) v1
Dec  1 04:13:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:44 np0005540741 ceph-mgr[75324]: [progress INFO root] update: starting ev a0021272-0d49-43fd-b7af-09d21e2bb5e5 (Updating mgr deployment (-1 -> 1))
Dec  1 04:13:44 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Removing daemon mgr.compute-0.htextg from compute-0 -- ports [8765]
Dec  1 04:13:44 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Removing daemon mgr.compute-0.htextg from compute-0 -- ports [8765]
Dec  1 04:13:44 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:13:44 np0005540741 python3[83886]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   status --format json | jq .osdmap.num_up_osds _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:13:45 np0005540741 podman[83958]: 2025-12-01 09:13:45.06476909 +0000 UTC m=+0.053854209 container create 5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72 (image=quay.io/ceph/ceph:v18, name=objective_davinci, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Dec  1 04:13:45 np0005540741 systemd[1]: Started libpod-conmon-5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72.scope.
Dec  1 04:13:45 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:45 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/042dc70dd7bc9edd0275eb290c5e7b3e95029da508c8d83d23a5f5afc7241b8e/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:45 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/042dc70dd7bc9edd0275eb290c5e7b3e95029da508c8d83d23a5f5afc7241b8e/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:45 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/042dc70dd7bc9edd0275eb290c5e7b3e95029da508c8d83d23a5f5afc7241b8e/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:45 np0005540741 podman[83958]: 2025-12-01 09:13:45.048463016 +0000 UTC m=+0.037548135 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:13:45 np0005540741 podman[83958]: 2025-12-01 09:13:45.13859935 +0000 UTC m=+0.127684489 container init 5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72 (image=quay.io/ceph/ceph:v18, name=objective_davinci, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec  1 04:13:45 np0005540741 podman[83958]: 2025-12-01 09:13:45.147754392 +0000 UTC m=+0.136839511 container start 5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72 (image=quay.io/ceph/ceph:v18, name=objective_davinci, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:13:45 np0005540741 podman[83958]: 2025-12-01 09:13:45.153040829 +0000 UTC m=+0.142125938 container attach 5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72 (image=quay.io/ceph/ceph:v18, name=objective_davinci, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True)
Dec  1 04:13:45 np0005540741 systemd[1]: Stopping Ceph mgr.compute-0.htextg for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: Added host compute-0
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: Saving service mon spec with placement compute-0
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: Saving service mgr spec with placement compute-0
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: Marking host: compute-0 for OSDSpec preview refresh.
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: Saving service osd.default_drive_group spec with placement compute-0
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: Removing daemon mgr.compute-0.htextg from compute-0 -- ports [8765]
Dec  1 04:13:45 np0005540741 podman[84053]: 2025-12-01 09:13:45.557846422 +0000 UTC m=+0.106772270 container died 81acc04116fbf4755e71fd111eeaa509f9560ae5f0189ee1e5afafef5403a9ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-htextg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Dec  1 04:13:45 np0005540741 systemd[1]: var-lib-containers-storage-overlay-2c2a8cd3227a37de55446263fb0fb96c95365808b997cd221ffa58d6e824aa46-merged.mount: Deactivated successfully.
Dec  1 04:13:45 np0005540741 podman[84053]: 2025-12-01 09:13:45.625660934 +0000 UTC m=+0.174586782 container remove 81acc04116fbf4755e71fd111eeaa509f9560ae5f0189ee1e5afafef5403a9ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-htextg, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507)
Dec  1 04:13:45 np0005540741 bash[84053]: ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-htextg
Dec  1 04:13:45 np0005540741 systemd[1]: ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@mgr.compute-0.htextg.service: Main process exited, code=exited, status=143/n/a
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json"} v 0) v1
Dec  1 04:13:45 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2938897347' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec  1 04:13:45 np0005540741 objective_davinci[83973]: 
Dec  1 04:13:45 np0005540741 objective_davinci[83973]: {"fsid":"5620a9fb-e540-5250-a0e8-7aaad5347e3b","health":{"status":"HEALTH_WARN","checks":{"TOO_FEW_OSDS":{"severity":"HEALTH_WARN","summary":{"message":"OSD count 0 < osd_pool_default_size 1","count":1},"muted":false}},"mutes":[]},"election_epoch":5,"quorum":[0],"quorum_names":["compute-0"],"quorum_age":82,"monmap":{"epoch":1,"min_mon_release_name":"reef","num_mons":1},"osdmap":{"epoch":3,"num_osds":0,"num_up_osds":0,"osd_up_since":0,"num_in_osds":0,"osd_in_since":0,"num_remapped_pgs":0},"pgmap":{"pgs_by_state":[],"num_pgs":0,"num_pools":0,"num_objects":0,"data_bytes":0,"bytes_used":0,"bytes_avail":0,"bytes_total":0},"fsmap":{"epoch":1,"by_rank":[],"up:standby":0},"mgrmap":{"available":true,"num_standbys":0,"modules":["cephadm","iostat","nfs","restful"],"services":{}},"servicemap":{"epoch":1,"modified":"2025-12-01T09:12:20.101670+0000","services":{}},"progress_events":{"0ad1d74c-45e7-464b-841d-9ea23a988291":{"message":"Updating mgr deployment (+1 -> 2) (0s)\n      [............................] ","progress":0,"add_to_ceph_s":true}}}
Dec  1 04:13:45 np0005540741 systemd[1]: libpod-5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72.scope: Deactivated successfully.
Dec  1 04:13:45 np0005540741 conmon[83973]: conmon 5912e43bcb69d1324ee3 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72.scope/container/memory.events
Dec  1 04:13:45 np0005540741 podman[83958]: 2025-12-01 09:13:45.835070908 +0000 UTC m=+0.824156027 container died 5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72 (image=quay.io/ceph/ceph:v18, name=objective_davinci, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:13:45 np0005540741 systemd[1]: var-lib-containers-storage-overlay-042dc70dd7bc9edd0275eb290c5e7b3e95029da508c8d83d23a5f5afc7241b8e-merged.mount: Deactivated successfully.
Dec  1 04:13:46 np0005540741 podman[83958]: 2025-12-01 09:13:46.040735182 +0000 UTC m=+1.029820301 container remove 5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72 (image=quay.io/ceph/ceph:v18, name=objective_davinci, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:13:46 np0005540741 systemd[1]: libpod-conmon-5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72.scope: Deactivated successfully.
Dec  1 04:13:46 np0005540741 systemd[1]: ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@mgr.compute-0.htextg.service: Failed with result 'exit-code'.
Dec  1 04:13:46 np0005540741 systemd[1]: Stopped Ceph mgr.compute-0.htextg for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec  1 04:13:46 np0005540741 systemd[1]: ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@mgr.compute-0.htextg.service: Consumed 3.727s CPU time.
Dec  1 04:13:46 np0005540741 systemd[1]: Reloading.
Dec  1 04:13:46 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:13:46 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:13:46 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.services.cephadmservice] Removing key for mgr.compute-0.htextg
Dec  1 04:13:46 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Removing key for mgr.compute-0.htextg
Dec  1 04:13:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth rm", "entity": "mgr.compute-0.htextg"} v 0) v1
Dec  1 04:13:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth rm", "entity": "mgr.compute-0.htextg"}]: dispatch
Dec  1 04:13:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "auth rm", "entity": "mgr.compute-0.htextg"}]': finished
Dec  1 04:13:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) v1
Dec  1 04:13:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:46 np0005540741 ceph-mgr[75324]: [progress INFO root] complete: finished ev a0021272-0d49-43fd-b7af-09d21e2bb5e5 (Updating mgr deployment (-1 -> 1))
Dec  1 04:13:46 np0005540741 ceph-mgr[75324]: [progress INFO root] Completed event a0021272-0d49-43fd-b7af-09d21e2bb5e5 (Updating mgr deployment (-1 -> 1)) in 2 seconds
Dec  1 04:13:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) v1
Dec  1 04:13:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:46 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 83de6e1c-582f-40e3-9017-57f8f353e9e3 does not exist
Dec  1 04:13:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:13:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:13:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:13:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:13:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:13:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:13:46 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v11: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:13:47 np0005540741 podman[84315]: 2025-12-01 09:13:47.047013324 +0000 UTC m=+0.051451748 container create 7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_pascal, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:13:47 np0005540741 systemd[1]: Started libpod-conmon-7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf.scope.
Dec  1 04:13:47 np0005540741 podman[84315]: 2025-12-01 09:13:47.025173936 +0000 UTC m=+0.029612400 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:13:47 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:47 np0005540741 podman[84315]: 2025-12-01 09:13:47.15001186 +0000 UTC m=+0.154450294 container init 7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_pascal, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 04:13:47 np0005540741 podman[84315]: 2025-12-01 09:13:47.157336528 +0000 UTC m=+0.161774942 container start 7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_pascal, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:13:47 np0005540741 magical_pascal[84332]: 167 167
Dec  1 04:13:47 np0005540741 systemd[1]: libpod-7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf.scope: Deactivated successfully.
Dec  1 04:13:47 np0005540741 podman[84315]: 2025-12-01 09:13:47.177060143 +0000 UTC m=+0.181498547 container attach 7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_pascal, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec  1 04:13:47 np0005540741 podman[84315]: 2025-12-01 09:13:47.17796415 +0000 UTC m=+0.182402574 container died 7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_pascal, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507)
Dec  1 04:13:47 np0005540741 systemd[1]: var-lib-containers-storage-overlay-3209a735490b9315d8878b71ea8f8959e60d36f0c3e955bfe2843293b3befc36-merged.mount: Deactivated successfully.
Dec  1 04:13:47 np0005540741 podman[84315]: 2025-12-01 09:13:47.218585935 +0000 UTC m=+0.223024349 container remove 7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_pascal, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:13:47 np0005540741 systemd[1]: libpod-conmon-7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf.scope: Deactivated successfully.
Dec  1 04:13:47 np0005540741 podman[84355]: 2025-12-01 09:13:47.382690305 +0000 UTC m=+0.026745445 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:13:47 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e3 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:13:47 np0005540741 podman[84355]: 2025-12-01 09:13:47.988212444 +0000 UTC m=+0.632267564 container create a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_goldwasser, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 04:13:47 np0005540741 ceph-mon[75031]: Removing key for mgr.compute-0.htextg
Dec  1 04:13:47 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth rm", "entity": "mgr.compute-0.htextg"}]: dispatch
Dec  1 04:13:47 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "auth rm", "entity": "mgr.compute-0.htextg"}]': finished
Dec  1 04:13:47 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:47 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:47 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:13:48 np0005540741 ceph-mgr[75324]: [progress INFO root] Writing back 3 completed events
Dec  1 04:13:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) v1
Dec  1 04:13:48 np0005540741 systemd[1]: Started libpod-conmon-a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44.scope.
Dec  1 04:13:48 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:48 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:13:48 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74c7265ba961e34952af6201846e515b128d80cd0e7d4202dbbaa89b7350da2c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:48 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74c7265ba961e34952af6201846e515b128d80cd0e7d4202dbbaa89b7350da2c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:48 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74c7265ba961e34952af6201846e515b128d80cd0e7d4202dbbaa89b7350da2c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:48 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74c7265ba961e34952af6201846e515b128d80cd0e7d4202dbbaa89b7350da2c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:48 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74c7265ba961e34952af6201846e515b128d80cd0e7d4202dbbaa89b7350da2c/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:13:48 np0005540741 podman[84355]: 2025-12-01 09:13:48.157498318 +0000 UTC m=+0.801553448 container init a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_goldwasser, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 04:13:48 np0005540741 podman[84355]: 2025-12-01 09:13:48.166839665 +0000 UTC m=+0.810894785 container start a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_goldwasser, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef)
Dec  1 04:13:48 np0005540741 podman[84355]: 2025-12-01 09:13:48.1774227 +0000 UTC m=+0.821477830 container attach a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_goldwasser, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:13:48 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v12: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:13:49 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:13:49 np0005540741 affectionate_goldwasser[84371]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:13:49 np0005540741 affectionate_goldwasser[84371]: --> relative data size: 1.0
Dec  1 04:13:49 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:13:49 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 9cfc4d29-4b80-4e2d-94cb-e544135847a5
Dec  1 04:13:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd new", "uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5"} v 0) v1
Dec  1 04:13:49 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3687895600' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5"}]: dispatch
Dec  1 04:13:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e3 do_prune osdmap full prune enabled
Dec  1 04:13:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e3 encode_pending skipping prime_pg_temp; mapping job did not start
Dec  1 04:13:49 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3687895600' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5"}]': finished
Dec  1 04:13:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e4 e4: 1 total, 0 up, 1 in
Dec  1 04:13:49 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e4: 1 total, 0 up, 1 in
Dec  1 04:13:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec  1 04:13:49 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec  1 04:13:49 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec  1 04:13:49 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:13:49 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0
Dec  1 04:13:49 np0005540741 lvm[84432]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 04:13:49 np0005540741 lvm[84432]: VG ceph_vg0 finished
Dec  1 04:13:49 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Dec  1 04:13:49 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  1 04:13:49 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec  1 04:13:49 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap
Dec  1 04:13:50 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/3687895600' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5"}]: dispatch
Dec  1 04:13:50 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/3687895600' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5"}]': finished
Dec  1 04:13:50 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon getmap"} v 0) v1
Dec  1 04:13:50 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2261522954' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Dec  1 04:13:50 np0005540741 affectionate_goldwasser[84371]: stderr: got monmap epoch 1
Dec  1 04:13:50 np0005540741 affectionate_goldwasser[84371]: --> Creating keyring file for osd.0
Dec  1 04:13:50 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring
Dec  1 04:13:50 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/
Dec  1 04:13:50 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid 9cfc4d29-4b80-4e2d-94cb-e544135847a5 --setuser ceph --setgroup ceph
Dec  1 04:13:50 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v14: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:13:51 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : Health check cleared: TOO_FEW_OSDS (was: OSD count 0 < osd_pool_default_size 1)
Dec  1 04:13:51 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : Cluster is now healthy
Dec  1 04:13:52 np0005540741 ceph-mon[75031]: Health check cleared: TOO_FEW_OSDS (was: OSD count 0 < osd_pool_default_size 1)
Dec  1 04:13:52 np0005540741 ceph-mon[75031]: Cluster is now healthy
Dec  1 04:13:52 np0005540741 affectionate_goldwasser[84371]: stderr: 2025-12-01T09:13:50.419+0000 7f73f023d740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec  1 04:13:52 np0005540741 affectionate_goldwasser[84371]: stderr: 2025-12-01T09:13:50.419+0000 7f73f023d740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec  1 04:13:52 np0005540741 affectionate_goldwasser[84371]: stderr: 2025-12-01T09:13:50.419+0000 7f73f023d740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec  1 04:13:52 np0005540741 affectionate_goldwasser[84371]: stderr: 2025-12-01T09:13:50.419+0000 7f73f023d740 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Dec  1 04:13:52 np0005540741 affectionate_goldwasser[84371]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Dec  1 04:13:52 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  1 04:13:52 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Dec  1 04:13:52 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e4 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:13:52 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec  1 04:13:52 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Dec  1 04:13:52 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  1 04:13:52 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  1 04:13:52 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v15: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:13:52 np0005540741 affectionate_goldwasser[84371]: --> ceph-volume lvm activate successful for osd ID: 0
Dec  1 04:13:52 np0005540741 affectionate_goldwasser[84371]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Dec  1 04:13:53 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:13:53 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new b055e1b3-f94e-4d5e-be04-bafc3cd07aa2
Dec  1 04:13:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd new", "uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2"} v 0) v1
Dec  1 04:13:53 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3614368394' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2"}]: dispatch
Dec  1 04:13:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e4 do_prune osdmap full prune enabled
Dec  1 04:13:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e4 encode_pending skipping prime_pg_temp; mapping job did not start
Dec  1 04:13:53 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3614368394' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2"}]': finished
Dec  1 04:13:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e5 e5: 2 total, 0 up, 2 in
Dec  1 04:13:53 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e5: 2 total, 0 up, 2 in
Dec  1 04:13:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec  1 04:13:53 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec  1 04:13:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:13:53 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:13:53 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec  1 04:13:53 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:13:53 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/3614368394' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2"}]: dispatch
Dec  1 04:13:53 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/3614368394' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2"}]': finished
Dec  1 04:13:53 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:13:53 np0005540741 lvm[85371]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec  1 04:13:53 np0005540741 lvm[85371]: VG ceph_vg1 finished
Dec  1 04:13:53 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-1
Dec  1 04:13:53 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Dec  1 04:13:53 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec  1 04:13:53 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-1/block
Dec  1 04:13:53 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-1/activate.monmap
Dec  1 04:13:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon getmap"} v 0) v1
Dec  1 04:13:53 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1284577700' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Dec  1 04:13:53 np0005540741 affectionate_goldwasser[84371]: stderr: got monmap epoch 1
Dec  1 04:13:54 np0005540741 affectionate_goldwasser[84371]: --> Creating keyring file for osd.1
Dec  1 04:13:54 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/keyring
Dec  1 04:13:54 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/
Dec  1 04:13:54 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 1 --monmap /var/lib/ceph/osd/ceph-1/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-1/ --osd-uuid b055e1b3-f94e-4d5e-be04-bafc3cd07aa2 --setuser ceph --setgroup ceph
Dec  1 04:13:54 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v17: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:13:56 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v18: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:13:57 np0005540741 affectionate_goldwasser[84371]: stderr: 2025-12-01T09:13:54.102+0000 7f3e917a0740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec  1 04:13:57 np0005540741 affectionate_goldwasser[84371]: stderr: 2025-12-01T09:13:54.102+0000 7f3e917a0740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec  1 04:13:57 np0005540741 affectionate_goldwasser[84371]: stderr: 2025-12-01T09:13:54.102+0000 7f3e917a0740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec  1 04:13:57 np0005540741 affectionate_goldwasser[84371]: stderr: 2025-12-01T09:13:54.102+0000 7f3e917a0740 -1 bluestore(/var/lib/ceph/osd/ceph-1/) _read_fsid unparsable uuid
Dec  1 04:13:57 np0005540741 affectionate_goldwasser[84371]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Dec  1 04:13:57 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec  1 04:13:57 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Dec  1 04:13:57 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-1/block
Dec  1 04:13:57 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Dec  1 04:13:57 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec  1 04:13:57 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec  1 04:13:57 np0005540741 affectionate_goldwasser[84371]: --> ceph-volume lvm activate successful for osd ID: 1
Dec  1 04:13:57 np0005540741 affectionate_goldwasser[84371]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Dec  1 04:13:57 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:13:57 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new c0c71a6c-e9f0-420a-90ae-6660eaf041be
Dec  1 04:13:57 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd new", "uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be"} v 0) v1
Dec  1 04:13:57 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/407475950' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be"}]: dispatch
Dec  1 04:13:57 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e5 do_prune osdmap full prune enabled
Dec  1 04:13:57 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e5 encode_pending skipping prime_pg_temp; mapping job did not start
Dec  1 04:13:57 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/407475950' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be"}]': finished
Dec  1 04:13:57 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e6 e6: 3 total, 0 up, 3 in
Dec  1 04:13:57 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e6: 3 total, 0 up, 3 in
Dec  1 04:13:57 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec  1 04:13:57 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec  1 04:13:57 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:13:57 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:13:57 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:13:57 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:13:57 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec  1 04:13:57 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:13:57 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:13:57 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e6 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:13:57 np0005540741 lvm[86308]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec  1 04:13:57 np0005540741 lvm[86308]: VG ceph_vg2 finished
Dec  1 04:13:57 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:13:58 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Dec  1 04:13:58 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg2/ceph_lv2
Dec  1 04:13:58 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2
Dec  1 04:13:58 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ln -s /dev/ceph_vg2/ceph_lv2 /var/lib/ceph/osd/ceph-2/block
Dec  1 04:13:58 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Dec  1 04:13:58 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon getmap"} v 0) v1
Dec  1 04:13:58 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/54861264' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Dec  1 04:13:58 np0005540741 affectionate_goldwasser[84371]: stderr: got monmap epoch 1
Dec  1 04:13:58 np0005540741 affectionate_goldwasser[84371]: --> Creating keyring file for osd.2
Dec  1 04:13:58 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Dec  1 04:13:58 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Dec  1 04:13:58 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid c0c71a6c-e9f0-420a-90ae-6660eaf041be --setuser ceph --setgroup ceph
Dec  1 04:13:58 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v20: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:13:59 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/407475950' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be"}]: dispatch
Dec  1 04:13:59 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/407475950' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be"}]': finished
Dec  1 04:14:00 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v21: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:14:02 np0005540741 affectionate_goldwasser[84371]: stderr: 2025-12-01T09:13:58.536+0000 7f8f66a21740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec  1 04:14:02 np0005540741 affectionate_goldwasser[84371]: stderr: 2025-12-01T09:13:58.536+0000 7f8f66a21740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec  1 04:14:02 np0005540741 affectionate_goldwasser[84371]: stderr: 2025-12-01T09:13:58.536+0000 7f8f66a21740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec  1 04:14:02 np0005540741 affectionate_goldwasser[84371]: stderr: 2025-12-01T09:13:58.536+0000 7f8f66a21740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Dec  1 04:14:02 np0005540741 affectionate_goldwasser[84371]: --> ceph-volume lvm prepare successful for: ceph_vg2/ceph_lv2
Dec  1 04:14:02 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  1 04:14:02 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg2/ceph_lv2 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Dec  1 04:14:02 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/ln -snf /dev/ceph_vg2/ceph_lv2 /var/lib/ceph/osd/ceph-2/block
Dec  1 04:14:02 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Dec  1 04:14:02 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2
Dec  1 04:14:02 np0005540741 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  1 04:14:02 np0005540741 affectionate_goldwasser[84371]: --> ceph-volume lvm activate successful for osd ID: 2
Dec  1 04:14:02 np0005540741 affectionate_goldwasser[84371]: --> ceph-volume lvm create successful for: ceph_vg2/ceph_lv2
Dec  1 04:14:02 np0005540741 systemd[1]: libpod-a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44.scope: Deactivated successfully.
Dec  1 04:14:02 np0005540741 systemd[1]: libpod-a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44.scope: Consumed 6.871s CPU time.
Dec  1 04:14:02 np0005540741 podman[87222]: 2025-12-01 09:14:02.656689128 +0000 UTC m=+0.029793369 container died a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_goldwasser, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:14:02 np0005540741 systemd[1]: var-lib-containers-storage-overlay-74c7265ba961e34952af6201846e515b128d80cd0e7d4202dbbaa89b7350da2c-merged.mount: Deactivated successfully.
Dec  1 04:14:02 np0005540741 podman[87222]: 2025-12-01 09:14:02.7466415 +0000 UTC m=+0.119745721 container remove a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_goldwasser, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:02 np0005540741 systemd[1]: libpod-conmon-a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44.scope: Deactivated successfully.
Dec  1 04:14:02 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e6 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:14:02 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v22: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:14:03 np0005540741 podman[87376]: 2025-12-01 09:14:03.332683353 +0000 UTC m=+0.042444155 container create 21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_blackburn, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec  1 04:14:03 np0005540741 systemd[1]: Started libpod-conmon-21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46.scope.
Dec  1 04:14:03 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:03 np0005540741 podman[87376]: 2025-12-01 09:14:03.400848321 +0000 UTC m=+0.110609143 container init 21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_blackburn, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:03 np0005540741 podman[87376]: 2025-12-01 09:14:03.310503997 +0000 UTC m=+0.020264819 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:03 np0005540741 podman[87376]: 2025-12-01 09:14:03.409169565 +0000 UTC m=+0.118930377 container start 21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_blackburn, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Dec  1 04:14:03 np0005540741 podman[87376]: 2025-12-01 09:14:03.414008262 +0000 UTC m=+0.123769094 container attach 21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_blackburn, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:14:03 np0005540741 pensive_blackburn[87393]: 167 167
Dec  1 04:14:03 np0005540741 systemd[1]: libpod-21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46.scope: Deactivated successfully.
Dec  1 04:14:03 np0005540741 podman[87376]: 2025-12-01 09:14:03.415636082 +0000 UTC m=+0.125396884 container died 21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_blackburn, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:14:03 np0005540741 systemd[1]: var-lib-containers-storage-overlay-1e71f6d1e2062245aae319a32ca70d58fc0e1c77ba9019ffa5c7c1beba0d42e4-merged.mount: Deactivated successfully.
Dec  1 04:14:03 np0005540741 podman[87376]: 2025-12-01 09:14:03.451590578 +0000 UTC m=+0.161351380 container remove 21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_blackburn, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:03 np0005540741 systemd[1]: libpod-conmon-21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46.scope: Deactivated successfully.
Dec  1 04:14:03 np0005540741 podman[87418]: 2025-12-01 09:14:03.674250375 +0000 UTC m=+0.105748135 container create 6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_kepler, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:03 np0005540741 podman[87418]: 2025-12-01 09:14:03.590037798 +0000 UTC m=+0.021535578 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:03 np0005540741 systemd[1]: Started libpod-conmon-6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a.scope.
Dec  1 04:14:03 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:03 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f91b24bdf23bba306838d7ed18b395c389a542559bb195576461ee2fe9080490/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:03 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f91b24bdf23bba306838d7ed18b395c389a542559bb195576461ee2fe9080490/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:03 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f91b24bdf23bba306838d7ed18b395c389a542559bb195576461ee2fe9080490/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:03 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f91b24bdf23bba306838d7ed18b395c389a542559bb195576461ee2fe9080490/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:03 np0005540741 podman[87418]: 2025-12-01 09:14:03.751872121 +0000 UTC m=+0.183369901 container init 6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_kepler, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec  1 04:14:03 np0005540741 podman[87418]: 2025-12-01 09:14:03.762414302 +0000 UTC m=+0.193912062 container start 6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_kepler, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:14:03 np0005540741 podman[87418]: 2025-12-01 09:14:03.766080644 +0000 UTC m=+0.197578404 container attach 6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_kepler, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]: {
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:    "0": [
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:        {
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "devices": [
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "/dev/loop3"
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            ],
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "lv_name": "ceph_lv0",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "lv_size": "21470642176",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "name": "ceph_lv0",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "tags": {
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.cluster_name": "ceph",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.crush_device_class": "",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.encrypted": "0",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.osd_id": "0",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.type": "block",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.vdo": "0"
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            },
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "type": "block",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "vg_name": "ceph_vg0"
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:        }
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:    ],
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:    "1": [
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:        {
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "devices": [
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "/dev/loop4"
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            ],
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "lv_name": "ceph_lv1",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "lv_size": "21470642176",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "name": "ceph_lv1",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "tags": {
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.cluster_name": "ceph",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.crush_device_class": "",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.encrypted": "0",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.osd_id": "1",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.type": "block",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.vdo": "0"
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            },
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "type": "block",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "vg_name": "ceph_vg1"
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:        }
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:    ],
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:    "2": [
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:        {
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "devices": [
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "/dev/loop5"
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            ],
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "lv_name": "ceph_lv2",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "lv_size": "21470642176",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "name": "ceph_lv2",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "tags": {
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.cluster_name": "ceph",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.crush_device_class": "",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.encrypted": "0",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.osd_id": "2",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.type": "block",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:                "ceph.vdo": "0"
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            },
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "type": "block",
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:            "vg_name": "ceph_vg2"
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:        }
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]:    ]
Dec  1 04:14:04 np0005540741 optimistic_kepler[87434]: }
Dec  1 04:14:04 np0005540741 systemd[1]: libpod-6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a.scope: Deactivated successfully.
Dec  1 04:14:04 np0005540741 podman[87418]: 2025-12-01 09:14:04.579632483 +0000 UTC m=+1.011130253 container died 6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_kepler, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:04 np0005540741 systemd[1]: var-lib-containers-storage-overlay-f91b24bdf23bba306838d7ed18b395c389a542559bb195576461ee2fe9080490-merged.mount: Deactivated successfully.
Dec  1 04:14:04 np0005540741 podman[87418]: 2025-12-01 09:14:04.652935667 +0000 UTC m=+1.084433427 container remove 6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_kepler, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:14:04 np0005540741 systemd[1]: libpod-conmon-6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a.scope: Deactivated successfully.
Dec  1 04:14:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0) v1
Dec  1 04:14:04 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Dec  1 04:14:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:14:04 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:14:04 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Deploying daemon osd.0 on compute-0
Dec  1 04:14:04 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Deploying daemon osd.0 on compute-0
Dec  1 04:14:04 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v23: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:14:05 np0005540741 podman[87596]: 2025-12-01 09:14:05.278386432 +0000 UTC m=+0.035631287 container create 156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_jemison, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=reef)
Dec  1 04:14:05 np0005540741 systemd[1]: Started libpod-conmon-156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a.scope.
Dec  1 04:14:05 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:05 np0005540741 podman[87596]: 2025-12-01 09:14:05.355864984 +0000 UTC m=+0.113109879 container init 156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_jemison, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec  1 04:14:05 np0005540741 podman[87596]: 2025-12-01 09:14:05.263483888 +0000 UTC m=+0.020728763 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:05 np0005540741 podman[87596]: 2025-12-01 09:14:05.363130636 +0000 UTC m=+0.120375511 container start 156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_jemison, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:14:05 np0005540741 podman[87596]: 2025-12-01 09:14:05.366775797 +0000 UTC m=+0.124020652 container attach 156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_jemison, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:05 np0005540741 vigilant_jemison[87613]: 167 167
Dec  1 04:14:05 np0005540741 systemd[1]: libpod-156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a.scope: Deactivated successfully.
Dec  1 04:14:05 np0005540741 podman[87596]: 2025-12-01 09:14:05.369228332 +0000 UTC m=+0.126473217 container died 156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_jemison, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:14:05 np0005540741 systemd[1]: var-lib-containers-storage-overlay-b126c9246aee2325621ddce099efc4957757572fdf530463c07d0480948e7d2d-merged.mount: Deactivated successfully.
Dec  1 04:14:05 np0005540741 podman[87596]: 2025-12-01 09:14:05.408200899 +0000 UTC m=+0.165445754 container remove 156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_jemison, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:14:05 np0005540741 systemd[1]: libpod-conmon-156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a.scope: Deactivated successfully.
Dec  1 04:14:05 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Dec  1 04:14:05 np0005540741 ceph-mon[75031]: Deploying daemon osd.0 on compute-0
Dec  1 04:14:05 np0005540741 podman[87645]: 2025-12-01 09:14:05.67294772 +0000 UTC m=+0.055293397 container create f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec  1 04:14:05 np0005540741 systemd[1]: Started libpod-conmon-f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d.scope.
Dec  1 04:14:05 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:05 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c5df66c4dda97f92248b2fc432ebcb9e977cd94c3b7d22619468b7e7b60bc43/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:05 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c5df66c4dda97f92248b2fc432ebcb9e977cd94c3b7d22619468b7e7b60bc43/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:05 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c5df66c4dda97f92248b2fc432ebcb9e977cd94c3b7d22619468b7e7b60bc43/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:05 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c5df66c4dda97f92248b2fc432ebcb9e977cd94c3b7d22619468b7e7b60bc43/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:05 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c5df66c4dda97f92248b2fc432ebcb9e977cd94c3b7d22619468b7e7b60bc43/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:05 np0005540741 podman[87645]: 2025-12-01 09:14:05.645932296 +0000 UTC m=+0.028278003 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:05 np0005540741 podman[87645]: 2025-12-01 09:14:05.740784137 +0000 UTC m=+0.123129914 container init f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:14:05 np0005540741 podman[87645]: 2025-12-01 09:14:05.748890004 +0000 UTC m=+0.131235721 container start f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:14:05 np0005540741 podman[87645]: 2025-12-01 09:14:05.756900349 +0000 UTC m=+0.139246056 container attach f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Dec  1 04:14:06 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test[87661]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Dec  1 04:14:06 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test[87661]:                            [--no-systemd] [--no-tmpfs]
Dec  1 04:14:06 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test[87661]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec  1 04:14:06 np0005540741 systemd[1]: libpod-f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d.scope: Deactivated successfully.
Dec  1 04:14:06 np0005540741 podman[87645]: 2025-12-01 09:14:06.461177917 +0000 UTC m=+0.843523604 container died f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:06 np0005540741 systemd[1]: var-lib-containers-storage-overlay-3c5df66c4dda97f92248b2fc432ebcb9e977cd94c3b7d22619468b7e7b60bc43-merged.mount: Deactivated successfully.
Dec  1 04:14:06 np0005540741 podman[87645]: 2025-12-01 09:14:06.525701984 +0000 UTC m=+0.908047671 container remove f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:06 np0005540741 systemd[1]: libpod-conmon-f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d.scope: Deactivated successfully.
Dec  1 04:14:06 np0005540741 systemd[1]: Reloading.
Dec  1 04:14:06 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:14:06 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:14:06 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v24: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:14:07 np0005540741 systemd[1]: Reloading.
Dec  1 04:14:07 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:14:07 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:14:07 np0005540741 systemd[1]: Starting Ceph osd.0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec  1 04:14:07 np0005540741 podman[87823]: 2025-12-01 09:14:07.509962465 +0000 UTC m=+0.047675165 container create 4903056ca526f0dcffa566d26a6e32bfec448b5ce4fee82ca77c671653f943fa (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:07 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:07 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68b7957790c37820780a9cdb80437b209de3b565a5fd04c39aadc5c6cb832d70/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:07 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68b7957790c37820780a9cdb80437b209de3b565a5fd04c39aadc5c6cb832d70/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:07 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68b7957790c37820780a9cdb80437b209de3b565a5fd04c39aadc5c6cb832d70/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:07 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68b7957790c37820780a9cdb80437b209de3b565a5fd04c39aadc5c6cb832d70/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:07 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68b7957790c37820780a9cdb80437b209de3b565a5fd04c39aadc5c6cb832d70/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:07 np0005540741 podman[87823]: 2025-12-01 09:14:07.486812549 +0000 UTC m=+0.024525249 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:07 np0005540741 podman[87823]: 2025-12-01 09:14:07.592906373 +0000 UTC m=+0.130619083 container init 4903056ca526f0dcffa566d26a6e32bfec448b5ce4fee82ca77c671653f943fa (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec  1 04:14:07 np0005540741 podman[87823]: 2025-12-01 09:14:07.600442943 +0000 UTC m=+0.138155643 container start 4903056ca526f0dcffa566d26a6e32bfec448b5ce4fee82ca77c671653f943fa (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec  1 04:14:07 np0005540741 podman[87823]: 2025-12-01 09:14:07.603732143 +0000 UTC m=+0.141444943 container attach 4903056ca526f0dcffa566d26a6e32bfec448b5ce4fee82ca77c671653f943fa (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:14:07 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e6 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:14:08 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate[87838]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  1 04:14:08 np0005540741 bash[87823]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  1 04:14:08 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate[87838]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Dec  1 04:14:08 np0005540741 bash[87823]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Dec  1 04:14:08 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate[87838]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Dec  1 04:14:08 np0005540741 bash[87823]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Dec  1 04:14:08 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate[87838]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  1 04:14:08 np0005540741 bash[87823]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  1 04:14:08 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate[87838]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec  1 04:14:08 np0005540741 bash[87823]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec  1 04:14:08 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate[87838]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  1 04:14:08 np0005540741 bash[87823]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  1 04:14:08 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate[87838]: --> ceph-volume raw activate successful for osd ID: 0
Dec  1 04:14:08 np0005540741 bash[87823]: --> ceph-volume raw activate successful for osd ID: 0
Dec  1 04:14:08 np0005540741 systemd[1]: libpod-4903056ca526f0dcffa566d26a6e32bfec448b5ce4fee82ca77c671653f943fa.scope: Deactivated successfully.
Dec  1 04:14:08 np0005540741 podman[87823]: 2025-12-01 09:14:08.646513319 +0000 UTC m=+1.184226019 container died 4903056ca526f0dcffa566d26a6e32bfec448b5ce4fee82ca77c671653f943fa (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True)
Dec  1 04:14:08 np0005540741 systemd[1]: libpod-4903056ca526f0dcffa566d26a6e32bfec448b5ce4fee82ca77c671653f943fa.scope: Consumed 1.058s CPU time.
Dec  1 04:14:08 np0005540741 systemd[1]: var-lib-containers-storage-overlay-68b7957790c37820780a9cdb80437b209de3b565a5fd04c39aadc5c6cb832d70-merged.mount: Deactivated successfully.
Dec  1 04:14:08 np0005540741 podman[87823]: 2025-12-01 09:14:08.696107551 +0000 UTC m=+1.233820251 container remove 4903056ca526f0dcffa566d26a6e32bfec448b5ce4fee82ca77c671653f943fa (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec  1 04:14:08 np0005540741 podman[88028]: 2025-12-01 09:14:08.882165592 +0000 UTC m=+0.037619857 container create b27d497db5b169524d5d8a1837eab1f4cd862a203a76444a6067c9edec102279 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec  1 04:14:08 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7352c180718802d143b8e0ceafb347d0f8d7a91c0b12986f42a4ab00b2eea713/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:08 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7352c180718802d143b8e0ceafb347d0f8d7a91c0b12986f42a4ab00b2eea713/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:08 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7352c180718802d143b8e0ceafb347d0f8d7a91c0b12986f42a4ab00b2eea713/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:08 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7352c180718802d143b8e0ceafb347d0f8d7a91c0b12986f42a4ab00b2eea713/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:08 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7352c180718802d143b8e0ceafb347d0f8d7a91c0b12986f42a4ab00b2eea713/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:08 np0005540741 podman[88028]: 2025-12-01 09:14:08.941714557 +0000 UTC m=+0.097168852 container init b27d497db5b169524d5d8a1837eab1f4cd862a203a76444a6067c9edec102279 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  1 04:14:08 np0005540741 podman[88028]: 2025-12-01 09:14:08.947506214 +0000 UTC m=+0.102960479 container start b27d497db5b169524d5d8a1837eab1f4cd862a203a76444a6067c9edec102279 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  1 04:14:08 np0005540741 bash[88028]: b27d497db5b169524d5d8a1837eab1f4cd862a203a76444a6067c9edec102279
Dec  1 04:14:08 np0005540741 podman[88028]: 2025-12-01 09:14:08.865407951 +0000 UTC m=+0.020862246 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:08 np0005540741 systemd[1]: Started Ceph osd.0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec  1 04:14:08 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v25: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:14:08 np0005540741 ceph-osd[88047]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:14:08 np0005540741 ceph-osd[88047]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Dec  1 04:14:08 np0005540741 ceph-osd[88047]: pidfile_write: ignore empty --pid-file
Dec  1 04:14:08 np0005540741 ceph-osd[88047]: bdev(0x55c73633f800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:14:08 np0005540741 ceph-osd[88047]: bdev(0x55c73633f800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:14:08 np0005540741 ceph-osd[88047]: bdev(0x55c73633f800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:08 np0005540741 ceph-osd[88047]: bdev(0x55c73633f800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:08 np0005540741 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:14:08 np0005540741 ceph-osd[88047]: bdev(0x55c737181800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:14:08 np0005540741 ceph-osd[88047]: bdev(0x55c737181800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:14:08 np0005540741 ceph-osd[88047]: bdev(0x55c737181800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:08 np0005540741 ceph-osd[88047]: bdev(0x55c737181800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:08 np0005540741 ceph-osd[88047]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec  1 04:14:08 np0005540741 ceph-osd[88047]: bdev(0x55c737181800 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:14:08 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:14:09 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:14:09 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0) v1
Dec  1 04:14:09 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Dec  1 04:14:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:14:09 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:14:09 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Deploying daemon osd.1 on compute-0
Dec  1 04:14:09 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Deploying daemon osd.1 on compute-0
Dec  1 04:14:09 np0005540741 ceph-osd[88047]: bdev(0x55c73633f800 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:14:09 np0005540741 ceph-osd[88047]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal
Dec  1 04:14:09 np0005540741 ceph-osd[88047]: load: jerasure load: lrc 
Dec  1 04:14:09 np0005540741 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:14:09 np0005540741 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:14:09 np0005540741 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:09 np0005540741 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:09 np0005540741 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:14:09 np0005540741 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:14:09 np0005540741 podman[88203]: 2025-12-01 09:14:09.550386701 +0000 UTC m=+0.039033181 container create bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_shaw, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:14:09 np0005540741 systemd[1]: Started libpod-conmon-bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f.scope.
Dec  1 04:14:09 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:09 np0005540741 podman[88203]: 2025-12-01 09:14:09.533838777 +0000 UTC m=+0.022485277 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:09 np0005540741 podman[88203]: 2025-12-01 09:14:09.644349385 +0000 UTC m=+0.132995885 container init bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_shaw, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:09 np0005540741 podman[88203]: 2025-12-01 09:14:09.650897645 +0000 UTC m=+0.139544125 container start bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_shaw, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:09 np0005540741 podman[88203]: 2025-12-01 09:14:09.653848815 +0000 UTC m=+0.142495295 container attach bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_shaw, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:14:09 np0005540741 busy_shaw[88224]: 167 167
Dec  1 04:14:09 np0005540741 systemd[1]: libpod-bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f.scope: Deactivated successfully.
Dec  1 04:14:09 np0005540741 podman[88203]: 2025-12-01 09:14:09.655875296 +0000 UTC m=+0.144521776 container died bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_shaw, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True)
Dec  1 04:14:09 np0005540741 systemd[1]: var-lib-containers-storage-overlay-713ca73f911a02f074e1d9102592b8b5c854294e32711516ee336769910d7809-merged.mount: Deactivated successfully.
Dec  1 04:14:09 np0005540741 podman[88203]: 2025-12-01 09:14:09.694732331 +0000 UTC m=+0.183378811 container remove bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_shaw, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  1 04:14:09 np0005540741 systemd[1]: libpod-conmon-bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f.scope: Deactivated successfully.
Dec  1 04:14:09 np0005540741 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:14:09 np0005540741 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:14:09 np0005540741 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:09 np0005540741 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:09 np0005540741 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:14:09 np0005540741 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:14:09 np0005540741 podman[88259]: 2025-12-01 09:14:09.939215833 +0000 UTC m=+0.048569721 container create a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:14:09 np0005540741 systemd[1]: Started libpod-conmon-a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25.scope.
Dec  1 04:14:09 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:09 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c21639ba04608201591abc3a56db0ee8dfc2c0ca08db065fb07b2e4a036ea93/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:09 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c21639ba04608201591abc3a56db0ee8dfc2c0ca08db065fb07b2e4a036ea93/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:09 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c21639ba04608201591abc3a56db0ee8dfc2c0ca08db065fb07b2e4a036ea93/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:09 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c21639ba04608201591abc3a56db0ee8dfc2c0ca08db065fb07b2e4a036ea93/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:10 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c21639ba04608201591abc3a56db0ee8dfc2c0ca08db065fb07b2e4a036ea93/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:10 np0005540741 podman[88259]: 2025-12-01 09:14:10.010360922 +0000 UTC m=+0.119714840 container init a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  1 04:14:10 np0005540741 podman[88259]: 2025-12-01 09:14:09.915533811 +0000 UTC m=+0.024887729 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:10 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:10 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:10 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Dec  1 04:14:10 np0005540741 ceph-mon[75031]: Deploying daemon osd.1 on compute-0
Dec  1 04:14:10 np0005540741 podman[88259]: 2025-12-01 09:14:10.017947023 +0000 UTC m=+0.127300911 container start a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec  1 04:14:10 np0005540741 podman[88259]: 2025-12-01 09:14:10.021412429 +0000 UTC m=+0.130766347 container attach a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluefs mount
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluefs mount shared_bdev_used = 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: RocksDB version: 7.9.2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Git sha 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Compile date 2025-05-06 23:30:25
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: DB SUMMARY
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: DB Session ID:  Z9IFZ8MDJU8BBS3TV8NA
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: CURRENT file:  CURRENT
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: IDENTITY file:  IDENTITY
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                         Options.error_if_exists: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.create_if_missing: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                         Options.paranoid_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                                     Options.env: 0x55c7371d3c70
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                                Options.info_log: 0x55c7363c68a0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_file_opening_threads: 16
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                              Options.statistics: (nil)
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.use_fsync: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.max_log_file_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                         Options.allow_fallocate: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.use_direct_reads: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.create_missing_column_families: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                              Options.db_log_dir: 
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                                 Options.wal_dir: db.wal
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.advise_random_on_open: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.write_buffer_manager: 0x55c7372dc460
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                            Options.rate_limiter: (nil)
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.unordered_write: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.row_cache: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                              Options.wal_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.allow_ingest_behind: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.two_write_queues: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.manual_wal_flush: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.wal_compression: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.atomic_flush: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.log_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.allow_data_in_errors: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.db_host_id: __hostname__
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.max_background_jobs: 4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.max_background_compactions: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.max_subcompactions: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.max_open_files: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.bytes_per_sync: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.max_background_flushes: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Compression algorithms supported:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: 	kZSTD supported: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: 	kXpressCompression supported: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: 	kBZip2Compression supported: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: 	kLZ4Compression supported: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: 	kZlibCompression supported: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: 	kLZ4HCCompression supported: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: 	kSnappyCompression supported: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c62c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55c7363b31f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c62c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55c7363b31f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c62c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55c7363b31f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c62c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55c7363b31f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c62c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55c7363b31f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c62c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55c7363b31f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c62c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55c7363b31f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c6240)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55c7363b3090
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c6240)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55c7363b3090
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c6240)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55c7363b3090
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 801eb657-3ccc-48c9-95d8-faada9292b70
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580450082975, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580450083179, "job": 1, "event": "recovery_finished"}
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: freelist init
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: freelist _read_cfg
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluefs umount
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluefs mount
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluefs mount shared_bdev_used = 4718592
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: RocksDB version: 7.9.2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Git sha 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Compile date 2025-05-06 23:30:25
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: DB SUMMARY
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: DB Session ID:  Z9IFZ8MDJU8BBS3TV8NB
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: CURRENT file:  CURRENT
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: IDENTITY file:  IDENTITY
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                         Options.error_if_exists: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.create_if_missing: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                         Options.paranoid_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                                     Options.env: 0x55c7371d3f10
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                                Options.info_log: 0x55c7363c6360
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_file_opening_threads: 16
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                              Options.statistics: (nil)
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.use_fsync: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.max_log_file_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                         Options.allow_fallocate: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.use_direct_reads: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.create_missing_column_families: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                              Options.db_log_dir: 
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                                 Options.wal_dir: db.wal
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.advise_random_on_open: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.write_buffer_manager: 0x55c7372dc6e0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                            Options.rate_limiter: (nil)
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.unordered_write: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.row_cache: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                              Options.wal_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.allow_ingest_behind: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.two_write_queues: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.manual_wal_flush: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.wal_compression: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.atomic_flush: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.log_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.allow_data_in_errors: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.db_host_id: __hostname__
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.max_background_jobs: 4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.max_background_compactions: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.max_subcompactions: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.max_open_files: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.bytes_per_sync: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.max_background_flushes: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Compression algorithms supported:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: #011kZSTD supported: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: #011kXpressCompression supported: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: #011kBZip2Compression supported: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: #011kLZ4Compression supported: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: #011kZlibCompression supported: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: #011kLZ4HCCompression supported: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: #011kSnappyCompression supported: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf560)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55c7363b31f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf560)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55c7363b31f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf560)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55c7363b31f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf560)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55c7363b31f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf560)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55c7363b31f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf560)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55c7363b31f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf560)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55c7363b31f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf580)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55c7363b3090
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf580)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55c7363b3090
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf580)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55c7363b3090
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 801eb657-3ccc-48c9-95d8-faada9292b70
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580450360067, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580450364881, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580450, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "801eb657-3ccc-48c9-95d8-faada9292b70", "db_session_id": "Z9IFZ8MDJU8BBS3TV8NB", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580450367819, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580450, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "801eb657-3ccc-48c9-95d8-faada9292b70", "db_session_id": "Z9IFZ8MDJU8BBS3TV8NB", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580450370457, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580450, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "801eb657-3ccc-48c9-95d8-faada9292b70", "db_session_id": "Z9IFZ8MDJU8BBS3TV8NB", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580450371782, "job": 1, "event": "recovery_finished"}
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55c736521c00
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: DB pointer 0x55c7372c5a00
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: _get_class not permitted to load lua
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: _get_class not permitted to load sdk
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: _get_class not permitted to load test_remote_reads
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: osd.0 0 load_pgs
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: osd.0 0 load_pgs opened 0 pgs
Dec  1 04:14:10 np0005540741 ceph-osd[88047]: osd.0 0 log_to_monitors true
Dec  1 04:14:10 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0[88043]: 2025-12-01T09:14:10.414+0000 7f107cb51740 -1 osd.0 0 log_to_monitors true
Dec  1 04:14:10 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]} v 0) v1
Dec  1 04:14:10 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
Dec  1 04:14:10 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test[88275]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Dec  1 04:14:10 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test[88275]:                            [--no-systemd] [--no-tmpfs]
Dec  1 04:14:10 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test[88275]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec  1 04:14:10 np0005540741 systemd[1]: libpod-a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25.scope: Deactivated successfully.
Dec  1 04:14:10 np0005540741 podman[88259]: 2025-12-01 09:14:10.74387125 +0000 UTC m=+0.853225148 container died a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec  1 04:14:10 np0005540741 systemd[1]: var-lib-containers-storage-overlay-3c21639ba04608201591abc3a56db0ee8dfc2c0ca08db065fb07b2e4a036ea93-merged.mount: Deactivated successfully.
Dec  1 04:14:10 np0005540741 podman[88259]: 2025-12-01 09:14:10.795190094 +0000 UTC m=+0.904544002 container remove a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:10 np0005540741 systemd[1]: libpod-conmon-a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25.scope: Deactivated successfully.
Dec  1 04:14:10 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v26: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:14:11 np0005540741 systemd[1]: Reloading.
Dec  1 04:14:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e6 do_prune osdmap full prune enabled
Dec  1 04:14:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e6 encode_pending skipping prime_pg_temp; mapping job did not start
Dec  1 04:14:11 np0005540741 ceph-mon[75031]: from='osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
Dec  1 04:14:11 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
Dec  1 04:14:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e7 e7: 3 total, 0 up, 3 in
Dec  1 04:14:11 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e7: 3 total, 0 up, 3 in
Dec  1 04:14:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]} v 0) v1
Dec  1 04:14:11 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Dec  1 04:14:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e7 create-or-move crush item name 'osd.0' initial_weight 0.0195 at location {host=compute-0,root=default}
Dec  1 04:14:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec  1 04:14:11 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec  1 04:14:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:11 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:11 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:11 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec  1 04:14:11 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:11 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:11 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:14:11 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:14:11 np0005540741 systemd[1]: Reloading.
Dec  1 04:14:11 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:14:11 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:14:11 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec  1 04:14:11 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec  1 04:14:11 np0005540741 systemd[1]: Starting Ceph osd.1 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec  1 04:14:11 np0005540741 podman[88843]: 2025-12-01 09:14:11.774458934 +0000 UTC m=+0.038693380 container create 6bc2fbcc342a035d8d509594a1c0281a207cd697fa9cb7c70c3a34c6dc652eac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:11 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:11 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0c3eff9402a8799d4f2c5353443051a495ce202894b25216d82b35e900c90b1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:11 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0c3eff9402a8799d4f2c5353443051a495ce202894b25216d82b35e900c90b1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:11 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0c3eff9402a8799d4f2c5353443051a495ce202894b25216d82b35e900c90b1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:11 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0c3eff9402a8799d4f2c5353443051a495ce202894b25216d82b35e900c90b1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:11 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0c3eff9402a8799d4f2c5353443051a495ce202894b25216d82b35e900c90b1/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:11 np0005540741 podman[88843]: 2025-12-01 09:14:11.837584088 +0000 UTC m=+0.101818554 container init 6bc2fbcc342a035d8d509594a1c0281a207cd697fa9cb7c70c3a34c6dc652eac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:14:11 np0005540741 podman[88843]: 2025-12-01 09:14:11.848859082 +0000 UTC m=+0.113093528 container start 6bc2fbcc342a035d8d509594a1c0281a207cd697fa9cb7c70c3a34c6dc652eac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  1 04:14:11 np0005540741 podman[88843]: 2025-12-01 09:14:11.852637127 +0000 UTC m=+0.116871573 container attach 6bc2fbcc342a035d8d509594a1c0281a207cd697fa9cb7c70c3a34c6dc652eac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:11 np0005540741 podman[88843]: 2025-12-01 09:14:11.757723434 +0000 UTC m=+0.021957900 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e7 do_prune osdmap full prune enabled
Dec  1 04:14:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e7 encode_pending skipping prime_pg_temp; mapping job did not start
Dec  1 04:14:12 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec  1 04:14:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e8 e8: 3 total, 0 up, 3 in
Dec  1 04:14:12 np0005540741 ceph-osd[88047]: osd.0 0 done with init, starting boot process
Dec  1 04:14:12 np0005540741 ceph-osd[88047]: osd.0 0 start_boot
Dec  1 04:14:12 np0005540741 ceph-osd[88047]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec  1 04:14:12 np0005540741 ceph-osd[88047]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec  1 04:14:12 np0005540741 ceph-osd[88047]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec  1 04:14:12 np0005540741 ceph-osd[88047]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec  1 04:14:12 np0005540741 ceph-osd[88047]: osd.0 0  bench count 12288000 bsize 4 KiB
Dec  1 04:14:12 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e8: 3 total, 0 up, 3 in
Dec  1 04:14:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec  1 04:14:12 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec  1 04:14:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:12 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:12 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:12 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec  1 04:14:12 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:12 np0005540741 ceph-mon[75031]: from='osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
Dec  1 04:14:12 np0005540741 ceph-mon[75031]: from='osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Dec  1 04:14:12 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:12 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/4016881853; not ready for session (expect reconnect)
Dec  1 04:14:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec  1 04:14:12 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec  1 04:14:12 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec  1 04:14:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e8 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:14:12 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate[88859]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec  1 04:14:12 np0005540741 bash[88843]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec  1 04:14:12 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate[88859]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-1 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Dec  1 04:14:12 np0005540741 bash[88843]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-1 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Dec  1 04:14:12 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v29: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:14:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:14:12
Dec  1 04:14:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:14:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:14:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] No pools available
Dec  1 04:14:12 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate[88859]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Dec  1 04:14:12 np0005540741 bash[88843]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Dec  1 04:14:12 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate[88859]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec  1 04:14:12 np0005540741 bash[88843]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec  1 04:14:12 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate[88859]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-1/block
Dec  1 04:14:12 np0005540741 bash[88843]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-1/block
Dec  1 04:14:12 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate[88859]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec  1 04:14:12 np0005540741 bash[88843]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec  1 04:14:12 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate[88859]: --> ceph-volume raw activate successful for osd ID: 1
Dec  1 04:14:12 np0005540741 bash[88843]: --> ceph-volume raw activate successful for osd ID: 1
Dec  1 04:14:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:14:13 np0005540741 systemd[1]: libpod-6bc2fbcc342a035d8d509594a1c0281a207cd697fa9cb7c70c3a34c6dc652eac.scope: Deactivated successfully.
Dec  1 04:14:13 np0005540741 systemd[1]: libpod-6bc2fbcc342a035d8d509594a1c0281a207cd697fa9cb7c70c3a34c6dc652eac.scope: Consumed 1.197s CPU time.
Dec  1 04:14:13 np0005540741 podman[88843]: 2025-12-01 09:14:13.032653426 +0000 UTC m=+1.296887872 container died 6bc2fbcc342a035d8d509594a1c0281a207cd697fa9cb7c70c3a34c6dc652eac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:14:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:14:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:14:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:14:13 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/4016881853; not ready for session (expect reconnect)
Dec  1 04:14:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:14:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:14:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:14:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:14:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec  1 04:14:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec  1 04:14:13 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec  1 04:14:13 np0005540741 ceph-mon[75031]: from='osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec  1 04:14:13 np0005540741 systemd[1]: var-lib-containers-storage-overlay-f0c3eff9402a8799d4f2c5353443051a495ce202894b25216d82b35e900c90b1-merged.mount: Deactivated successfully.
Dec  1 04:14:13 np0005540741 podman[88843]: 2025-12-01 09:14:13.154618584 +0000 UTC m=+1.418853030 container remove 6bc2fbcc342a035d8d509594a1c0281a207cd697fa9cb7c70c3a34c6dc652eac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef)
Dec  1 04:14:13 np0005540741 podman[89033]: 2025-12-01 09:14:13.341977375 +0000 UTC m=+0.022108915 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:13 np0005540741 podman[89033]: 2025-12-01 09:14:13.442631553 +0000 UTC m=+0.122763083 container create 2203330e3b4c084a5051630983687321ba42178f6b21acc3ece7e642356ce8a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507)
Dec  1 04:14:13 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57d2f59c1ffba8320e050daaf2d47de0e88136e8bfe81b0b3234263632cbdfd9/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:13 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57d2f59c1ffba8320e050daaf2d47de0e88136e8bfe81b0b3234263632cbdfd9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:13 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57d2f59c1ffba8320e050daaf2d47de0e88136e8bfe81b0b3234263632cbdfd9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:13 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57d2f59c1ffba8320e050daaf2d47de0e88136e8bfe81b0b3234263632cbdfd9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:13 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57d2f59c1ffba8320e050daaf2d47de0e88136e8bfe81b0b3234263632cbdfd9/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:13 np0005540741 podman[89033]: 2025-12-01 09:14:13.665161245 +0000 UTC m=+0.345292775 container init 2203330e3b4c084a5051630983687321ba42178f6b21acc3ece7e642356ce8a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True)
Dec  1 04:14:13 np0005540741 podman[89033]: 2025-12-01 09:14:13.671740466 +0000 UTC m=+0.351871986 container start 2203330e3b4c084a5051630983687321ba42178f6b21acc3ece7e642356ce8a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec  1 04:14:13 np0005540741 ceph-osd[89052]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:14:13 np0005540741 ceph-osd[89052]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Dec  1 04:14:13 np0005540741 ceph-osd[89052]: pidfile_write: ignore empty --pid-file
Dec  1 04:14:13 np0005540741 ceph-osd[89052]: bdev(0x555f190d5800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec  1 04:14:13 np0005540741 ceph-osd[89052]: bdev(0x555f190d5800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec  1 04:14:13 np0005540741 ceph-osd[89052]: bdev(0x555f190d5800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:13 np0005540741 ceph-osd[89052]: bdev(0x555f190d5800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:13 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:14:13 np0005540741 ceph-osd[89052]: bdev(0x555f19f0d800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec  1 04:14:13 np0005540741 ceph-osd[89052]: bdev(0x555f19f0d800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec  1 04:14:13 np0005540741 ceph-osd[89052]: bdev(0x555f19f0d800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:13 np0005540741 ceph-osd[89052]: bdev(0x555f19f0d800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:13 np0005540741 ceph-osd[89052]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Dec  1 04:14:13 np0005540741 ceph-osd[89052]: bdev(0x555f19f0d800 /var/lib/ceph/osd/ceph-1/block) close
Dec  1 04:14:13 np0005540741 bash[89033]: 2203330e3b4c084a5051630983687321ba42178f6b21acc3ece7e642356ce8a9
Dec  1 04:14:13 np0005540741 systemd[1]: Started Ceph osd.1 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec  1 04:14:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:14:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:14:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0) v1
Dec  1 04:14:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Dec  1 04:14:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:14:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:14:13 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Deploying daemon osd.2 on compute-0
Dec  1 04:14:13 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Deploying daemon osd.2 on compute-0
Dec  1 04:14:13 np0005540741 ceph-osd[89052]: bdev(0x555f190d5800 /var/lib/ceph/osd/ceph-1/block) close
Dec  1 04:14:14 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/4016881853; not ready for session (expect reconnect)
Dec  1 04:14:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec  1 04:14:14 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec  1 04:14:14 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec  1 04:14:14 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:14 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:14 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Dec  1 04:14:14 np0005540741 ceph-mon[75031]: Deploying daemon osd.2 on compute-0
Dec  1 04:14:14 np0005540741 ceph-osd[89052]: starting osd.1 osd_data /var/lib/ceph/osd/ceph-1 /var/lib/ceph/osd/ceph-1/journal
Dec  1 04:14:14 np0005540741 ceph-osd[89052]: load: jerasure load: lrc 
Dec  1 04:14:14 np0005540741 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec  1 04:14:14 np0005540741 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec  1 04:14:14 np0005540741 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:14 np0005540741 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:14 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:14:14 np0005540741 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) close
Dec  1 04:14:14 np0005540741 podman[89215]: 2025-12-01 09:14:14.744373712 +0000 UTC m=+0.239929594 container create 0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_heisenberg, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True)
Dec  1 04:14:14 np0005540741 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec  1 04:14:14 np0005540741 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec  1 04:14:14 np0005540741 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:14 np0005540741 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:14 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:14:14 np0005540741 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) close
Dec  1 04:14:14 np0005540741 systemd[1]: Started libpod-conmon-0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980.scope.
Dec  1 04:14:14 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:14 np0005540741 podman[89215]: 2025-12-01 09:14:14.724664351 +0000 UTC m=+0.220220253 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:14 np0005540741 podman[89215]: 2025-12-01 09:14:14.832078586 +0000 UTC m=+0.327634488 container init 0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_heisenberg, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:14 np0005540741 podman[89215]: 2025-12-01 09:14:14.838819741 +0000 UTC m=+0.334375623 container start 0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_heisenberg, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Dec  1 04:14:14 np0005540741 flamboyant_heisenberg[89236]: 167 167
Dec  1 04:14:14 np0005540741 systemd[1]: libpod-0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980.scope: Deactivated successfully.
Dec  1 04:14:14 np0005540741 podman[89215]: 2025-12-01 09:14:14.863947797 +0000 UTC m=+0.359503699 container attach 0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_heisenberg, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec  1 04:14:14 np0005540741 podman[89215]: 2025-12-01 09:14:14.864355359 +0000 UTC m=+0.359911241 container died 0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_heisenberg, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec  1 04:14:14 np0005540741 systemd[1]: var-lib-containers-storage-overlay-0e776f518f3fd9a519057cca5037d204571739e1cf2e78b5472cd05f523adbf7-merged.mount: Deactivated successfully.
Dec  1 04:14:14 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v30: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: osd.1:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluefs mount
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec  1 04:14:15 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/4016881853; not ready for session (expect reconnect)
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluefs mount shared_bdev_used = 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: RocksDB version: 7.9.2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Git sha 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Compile date 2025-05-06 23:30:25
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: DB SUMMARY
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: DB Session ID:  NR1ZS73OTAFE67X1H6FN
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: CURRENT file:  CURRENT
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: IDENTITY file:  IDENTITY
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                         Options.error_if_exists: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.create_if_missing: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                         Options.paranoid_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                                     Options.env: 0x555f19f5fc70
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                                Options.info_log: 0x555f1915c8a0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_file_opening_threads: 16
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                              Options.statistics: (nil)
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.use_fsync: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.max_log_file_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                         Options.allow_fallocate: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.use_direct_reads: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.create_missing_column_families: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                              Options.db_log_dir: 
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                                 Options.wal_dir: db.wal
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.advise_random_on_open: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.write_buffer_manager: 0x555f1a068460
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                            Options.rate_limiter: (nil)
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.unordered_write: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.row_cache: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                              Options.wal_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.allow_ingest_behind: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.two_write_queues: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.manual_wal_flush: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.wal_compression: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.atomic_flush: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.log_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.allow_data_in_errors: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.db_host_id: __hostname__
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.max_background_jobs: 4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.max_background_compactions: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.max_subcompactions: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.max_open_files: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.bytes_per_sync: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.max_background_flushes: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Compression algorithms supported:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: #011kZSTD supported: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: #011kXpressCompression supported: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: #011kBZip2Compression supported: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: #011kLZ4Compression supported: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: #011kZlibCompression supported: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: #011kLZ4HCCompression supported: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: #011kSnappyCompression supported: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c2c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x555f191491f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c2c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x555f191491f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec  1 04:14:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec  1 04:14:15 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec  1 04:14:15 np0005540741 podman[89215]: 2025-12-01 09:14:15.091161403 +0000 UTC m=+0.586717285 container remove 0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_heisenberg, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c2c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x555f191491f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c2c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x555f191491f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 systemd[1]: libpod-conmon-0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980.scope: Deactivated successfully.
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c2c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x555f191491f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c2c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x555f191491f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c2c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x555f191491f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c240)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x555f19149090#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c240)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x555f19149090#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c240)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x555f19149090#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 5fcb8a6d-e992-4698-8305-75c619417288
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580455130933, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580455131142, "job": 1, "event": "recovery_finished"}
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old nid_max 1025
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old blobid_max 10240
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta min_alloc_size 0x1000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: freelist init
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: freelist _read_cfg
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluefs umount
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) close
Dec  1 04:14:15 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluefs mount
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluefs mount shared_bdev_used = 4718592
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: RocksDB version: 7.9.2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Git sha 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Compile date 2025-05-06 23:30:25
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: DB SUMMARY
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: DB Session ID:  NR1ZS73OTAFE67X1H6FM
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: CURRENT file:  CURRENT
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: IDENTITY file:  IDENTITY
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                         Options.error_if_exists: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.create_if_missing: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                         Options.paranoid_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                                     Options.env: 0x555f1a110380
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                                Options.info_log: 0x555f19153280
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_file_opening_threads: 16
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                              Options.statistics: (nil)
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.use_fsync: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.max_log_file_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                         Options.allow_fallocate: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.use_direct_reads: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.create_missing_column_families: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                              Options.db_log_dir: 
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                                 Options.wal_dir: db.wal
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.advise_random_on_open: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.write_buffer_manager: 0x555f1a0686e0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                            Options.rate_limiter: (nil)
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.unordered_write: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.row_cache: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                              Options.wal_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.allow_ingest_behind: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.two_write_queues: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.manual_wal_flush: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.wal_compression: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.atomic_flush: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.log_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.allow_data_in_errors: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.db_host_id: __hostname__
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.max_background_jobs: 4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.max_background_compactions: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.max_subcompactions: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.max_open_files: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.bytes_per_sync: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.max_background_flushes: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Compression algorithms supported:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: #011kZSTD supported: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: #011kXpressCompression supported: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: #011kBZip2Compression supported: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: #011kLZ4Compression supported: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: #011kZlibCompression supported: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: #011kLZ4HCCompression supported: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: #011kSnappyCompression supported: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1912fc80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x555f191491f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1912fc80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x555f191491f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1912fc80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x555f191491f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1912fc80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x555f191491f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1912fc80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x555f191491f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1912fc80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x555f191491f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1912fc80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x555f191491f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f19153840)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x555f19149090
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f19153840)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x555f19149090
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f19153840)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x555f19149090#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 5fcb8a6d-e992-4698-8305-75c619417288
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580455341591, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580455395271, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580455, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fcb8a6d-e992-4698-8305-75c619417288", "db_session_id": "NR1ZS73OTAFE67X1H6FM", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580455398873, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580455, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fcb8a6d-e992-4698-8305-75c619417288", "db_session_id": "NR1ZS73OTAFE67X1H6FM", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580455401271, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580455, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fcb8a6d-e992-4698-8305-75c619417288", "db_session_id": "NR1ZS73OTAFE67X1H6FM", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580455402883, "job": 1, "event": "recovery_finished"}
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec  1 04:14:15 np0005540741 podman[89644]: 2025-12-01 09:14:15.440218553 +0000 UTC m=+0.070782859 container create ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x555f192b7c00
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: DB pointer 0x555f1a051a00
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super from 4, latest 4
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super done
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.2 total, 0.2 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.2 total, 0.2 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds#012Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.2 total, 0.2 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.2 total, 0.2 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 460.80 MB usag
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: _get_class not permitted to load lua
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: _get_class not permitted to load sdk
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: _get_class not permitted to load test_remote_reads
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: osd.1 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: osd.1 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: osd.1 0 load_pgs
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: osd.1 0 load_pgs opened 0 pgs
Dec  1 04:14:15 np0005540741 ceph-osd[89052]: osd.1 0 log_to_monitors true
Dec  1 04:14:15 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1[89048]: 2025-12-01T09:14:15.480+0000 7f11a5e1b740 -1 osd.1 0 log_to_monitors true
Dec  1 04:14:15 np0005540741 podman[89644]: 2025-12-01 09:14:15.394084977 +0000 UTC m=+0.024649283 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]} v 0) v1
Dec  1 04:14:15 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
Dec  1 04:14:15 np0005540741 systemd[1]: Started libpod-conmon-ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e.scope.
Dec  1 04:14:15 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d68077d642ca5a1114cd317028679bfdfa8a23b8e06cc700eadf92f64ec9e8dd/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d68077d642ca5a1114cd317028679bfdfa8a23b8e06cc700eadf92f64ec9e8dd/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d68077d642ca5a1114cd317028679bfdfa8a23b8e06cc700eadf92f64ec9e8dd/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d68077d642ca5a1114cd317028679bfdfa8a23b8e06cc700eadf92f64ec9e8dd/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d68077d642ca5a1114cd317028679bfdfa8a23b8e06cc700eadf92f64ec9e8dd/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:15 np0005540741 podman[89644]: 2025-12-01 09:14:15.573428783 +0000 UTC m=+0.203993109 container init ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec  1 04:14:15 np0005540741 podman[89644]: 2025-12-01 09:14:15.582446928 +0000 UTC m=+0.213011234 container start ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507)
Dec  1 04:14:15 np0005540741 podman[89644]: 2025-12-01 09:14:15.598951241 +0000 UTC m=+0.229515547 container attach ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec  1 04:14:16 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/4016881853; not ready for session (expect reconnect)
Dec  1 04:14:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec  1 04:14:16 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec  1 04:14:16 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec  1 04:14:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e8 do_prune osdmap full prune enabled
Dec  1 04:14:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e8 encode_pending skipping prime_pg_temp; mapping job did not start
Dec  1 04:14:16 np0005540741 ceph-mon[75031]: from='osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
Dec  1 04:14:16 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
Dec  1 04:14:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e9 e9: 3 total, 0 up, 3 in
Dec  1 04:14:16 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e9: 3 total, 0 up, 3 in
Dec  1 04:14:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec  1 04:14:16 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec  1 04:14:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:16 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:16 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:16 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec  1 04:14:16 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:16 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]} v 0) v1
Dec  1 04:14:16 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Dec  1 04:14:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e9 create-or-move crush item name 'osd.1' initial_weight 0.0195 at location {host=compute-0,root=default}
Dec  1 04:14:16 np0005540741 python3[89724]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   status --format json | jq .osdmap.num_up_osds _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:14:16 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec  1 04:14:16 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec  1 04:14:16 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test[89694]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Dec  1 04:14:16 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test[89694]:                            [--no-systemd] [--no-tmpfs]
Dec  1 04:14:16 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test[89694]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec  1 04:14:16 np0005540741 systemd[1]: libpod-ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e.scope: Deactivated successfully.
Dec  1 04:14:16 np0005540741 systemd[1]: libpod-ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e.scope: Consumed 1.083s CPU time.
Dec  1 04:14:16 np0005540741 podman[89726]: 2025-12-01 09:14:16.834551315 +0000 UTC m=+0.246097233 container create 5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695 (image=quay.io/ceph/ceph:v18, name=amazing_leavitt, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec  1 04:14:16 np0005540741 podman[89726]: 2025-12-01 09:14:16.740269451 +0000 UTC m=+0.151815389 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:14:16 np0005540741 podman[89644]: 2025-12-01 09:14:16.873248295 +0000 UTC m=+1.503812601 container died ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec  1 04:14:16 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v32: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:14:17 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/4016881853; not ready for session (expect reconnect)
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec  1 04:14:17 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec  1 04:14:17 np0005540741 systemd[1]: Started libpod-conmon-5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695.scope.
Dec  1 04:14:17 np0005540741 systemd[1]: var-lib-containers-storage-overlay-d68077d642ca5a1114cd317028679bfdfa8a23b8e06cc700eadf92f64ec9e8dd-merged.mount: Deactivated successfully.
Dec  1 04:14:17 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e53e3256469721831714918dd626b9cf4dc03de3463be90f34cbdd78ed981626/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e53e3256469721831714918dd626b9cf4dc03de3463be90f34cbdd78ed981626/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e53e3256469721831714918dd626b9cf4dc03de3463be90f34cbdd78ed981626/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:17 np0005540741 podman[89644]: 2025-12-01 09:14:17.094076306 +0000 UTC m=+1.724640612 container remove ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec  1 04:14:17 np0005540741 systemd[1]: libpod-conmon-ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e.scope: Deactivated successfully.
Dec  1 04:14:17 np0005540741 podman[89726]: 2025-12-01 09:14:17.117156179 +0000 UTC m=+0.528702127 container init 5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695 (image=quay.io/ceph/ceph:v18, name=amazing_leavitt, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  1 04:14:17 np0005540741 podman[89726]: 2025-12-01 09:14:17.154871149 +0000 UTC m=+0.566417057 container start 5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695 (image=quay.io/ceph/ceph:v18, name=amazing_leavitt, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec  1 04:14:17 np0005540741 podman[89726]: 2025-12-01 09:14:17.16705348 +0000 UTC m=+0.578599428 container attach 5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695 (image=quay.io/ceph/ceph:v18, name=amazing_leavitt, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e9 do_prune osdmap full prune enabled
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e9 encode_pending skipping prime_pg_temp; mapping job did not start
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: from='osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: from='osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e10 e10: 3 total, 0 up, 3 in
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e10: 3 total, 0 up, 3 in
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:17 np0005540741 ceph-osd[89052]: osd.1 0 done with init, starting boot process
Dec  1 04:14:17 np0005540741 ceph-osd[89052]: osd.1 0 start_boot
Dec  1 04:14:17 np0005540741 ceph-osd[89052]: osd.1 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec  1 04:14:17 np0005540741 ceph-osd[89052]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec  1 04:14:17 np0005540741 ceph-osd[89052]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec  1 04:14:17 np0005540741 ceph-osd[89052]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec  1 04:14:17 np0005540741 ceph-osd[89052]: osd.1 0  bench count 12288000 bsize 4 KiB
Dec  1 04:14:17 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec  1 04:14:17 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:17 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:17 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:17 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:17 np0005540741 systemd[1]: Reloading.
Dec  1 04:14:17 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:14:17 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:14:17 np0005540741 systemd[1]: Reloading.
Dec  1 04:14:17 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:14:17 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:14:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e10 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:14:18 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/4016881853; not ready for session (expect reconnect)
Dec  1 04:14:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec  1 04:14:18 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec  1 04:14:18 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec  1 04:14:18 np0005540741 systemd[1]: Starting Ceph osd.2 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec  1 04:14:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json"} v 0) v1
Dec  1 04:14:18 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2893147319' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec  1 04:14:18 np0005540741 amazing_leavitt[89757]: 
Dec  1 04:14:18 np0005540741 amazing_leavitt[89757]: {"fsid":"5620a9fb-e540-5250-a0e8-7aaad5347e3b","health":{"status":"HEALTH_OK","checks":{},"mutes":[]},"election_epoch":5,"quorum":[0],"quorum_names":["compute-0"],"quorum_age":115,"monmap":{"epoch":1,"min_mon_release_name":"reef","num_mons":1},"osdmap":{"epoch":10,"num_osds":3,"num_up_osds":0,"osd_up_since":0,"num_in_osds":3,"osd_in_since":1764580437,"num_remapped_pgs":0},"pgmap":{"pgs_by_state":[],"num_pgs":0,"num_pools":0,"num_objects":0,"data_bytes":0,"bytes_used":0,"bytes_avail":0,"bytes_total":0},"fsmap":{"epoch":1,"by_rank":[],"up:standby":0},"mgrmap":{"available":true,"num_standbys":0,"modules":["cephadm","iostat","nfs","restful"],"services":{}},"servicemap":{"epoch":2,"modified":"2025-12-01T09:14:14.959091+0000","services":{}},"progress_events":{}}
Dec  1 04:14:18 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec  1 04:14:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:18 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:18 np0005540741 ceph-mon[75031]: from='osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec  1 04:14:18 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:18 np0005540741 systemd[1]: libpod-5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695.scope: Deactivated successfully.
Dec  1 04:14:18 np0005540741 systemd[1]: libpod-5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695.scope: Consumed 1.141s CPU time.
Dec  1 04:14:18 np0005540741 podman[89726]: 2025-12-01 09:14:18.314320941 +0000 UTC m=+1.725866889 container died 5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695 (image=quay.io/ceph/ceph:v18, name=amazing_leavitt, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec  1 04:14:18 np0005540741 systemd[1]: var-lib-containers-storage-overlay-e53e3256469721831714918dd626b9cf4dc03de3463be90f34cbdd78ed981626-merged.mount: Deactivated successfully.
Dec  1 04:14:18 np0005540741 podman[89726]: 2025-12-01 09:14:18.876181887 +0000 UTC m=+2.287727805 container remove 5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695 (image=quay.io/ceph/ceph:v18, name=amazing_leavitt, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 04:14:18 np0005540741 systemd[1]: libpod-conmon-5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695.scope: Deactivated successfully.
Dec  1 04:14:18 np0005540741 podman[89934]: 2025-12-01 09:14:18.930584016 +0000 UTC m=+0.507652036 container create 0a1791e946a9ed36364ab7a161adfca0762642f0ca98bc892e1b25a475673bb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:14:18 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v34: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec  1 04:14:18 np0005540741 podman[89934]: 2025-12-01 09:14:18.898149917 +0000 UTC m=+0.475217957 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:18 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a7370faf47ecc46d2f5cdc94c3e09cfe3b4487f61929d197202eb934212d00a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:19 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a7370faf47ecc46d2f5cdc94c3e09cfe3b4487f61929d197202eb934212d00a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:19 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a7370faf47ecc46d2f5cdc94c3e09cfe3b4487f61929d197202eb934212d00a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:19 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a7370faf47ecc46d2f5cdc94c3e09cfe3b4487f61929d197202eb934212d00a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:19 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a7370faf47ecc46d2f5cdc94c3e09cfe3b4487f61929d197202eb934212d00a/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:19 np0005540741 podman[89934]: 2025-12-01 09:14:19.031804321 +0000 UTC m=+0.608872371 container init 0a1791e946a9ed36364ab7a161adfca0762642f0ca98bc892e1b25a475673bb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:19 np0005540741 podman[89934]: 2025-12-01 09:14:19.037947128 +0000 UTC m=+0.615015148 container start 0a1791e946a9ed36364ab7a161adfca0762642f0ca98bc892e1b25a475673bb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:14:19 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/4016881853; not ready for session (expect reconnect)
Dec  1 04:14:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec  1 04:14:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec  1 04:14:19 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec  1 04:14:19 np0005540741 podman[89934]: 2025-12-01 09:14:19.061739544 +0000 UTC m=+0.638807564 container attach 0a1791e946a9ed36364ab7a161adfca0762642f0ca98bc892e1b25a475673bb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True)
Dec  1 04:14:19 np0005540741 ceph-osd[88047]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 6.477 iops: 1658.093 elapsed_sec: 1.809
Dec  1 04:14:19 np0005540741 ceph-osd[88047]: log_channel(cluster) log [WRN] : OSD bench result of 1658.093265 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec  1 04:14:19 np0005540741 ceph-osd[88047]: osd.0 0 waiting for initial osdmap
Dec  1 04:14:19 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0[88043]: 2025-12-01T09:14:19.119+0000 7f1078ad1640 -1 osd.0 0 waiting for initial osdmap
Dec  1 04:14:19 np0005540741 ceph-osd[88047]: osd.0 10 crush map has features 288514050185494528, adjusting msgr requires for clients
Dec  1 04:14:19 np0005540741 ceph-osd[88047]: osd.0 10 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Dec  1 04:14:19 np0005540741 ceph-osd[88047]: osd.0 10 crush map has features 3314932999778484224, adjusting msgr requires for osds
Dec  1 04:14:19 np0005540741 ceph-osd[88047]: osd.0 10 check_osdmap_features require_osd_release unknown -> reef
Dec  1 04:14:19 np0005540741 ceph-osd[88047]: osd.0 10 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec  1 04:14:19 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0[88043]: 2025-12-01T09:14:19.209+0000 7f10740f9640 -1 osd.0 10 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec  1 04:14:19 np0005540741 ceph-osd[88047]: osd.0 10 set_numa_affinity not setting numa affinity
Dec  1 04:14:19 np0005540741 ceph-osd[88047]: osd.0 10 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Dec  1 04:14:19 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec  1 04:14:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e10 do_prune osdmap full prune enabled
Dec  1 04:14:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e10 encode_pending skipping prime_pg_temp; mapping job did not start
Dec  1 04:14:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e11 e11: 3 total, 1 up, 3 in
Dec  1 04:14:19 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853] boot
Dec  1 04:14:19 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e11: 3 total, 1 up, 3 in
Dec  1 04:14:19 np0005540741 ceph-osd[88047]: osd.0 11 state: booting -> active
Dec  1 04:14:19 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec  1 04:14:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec  1 04:14:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:19 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:20 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec  1 04:14:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:20 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:20 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:20 np0005540741 ceph-mon[75031]: OSD bench result of 1658.093265 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec  1 04:14:20 np0005540741 ceph-mon[75031]: osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853] boot
Dec  1 04:14:20 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v36: 0 pgs: ; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Dec  1 04:14:21 np0005540741 ceph-mgr[75324]: [devicehealth INFO root] creating mgr pool
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true} v 0) v1
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
Dec  1 04:14:21 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate[89951]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  1 04:14:21 np0005540741 bash[89934]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  1 04:14:21 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate[89951]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg2-ceph_lv2
Dec  1 04:14:21 np0005540741 bash[89934]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg2-ceph_lv2
Dec  1 04:14:21 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate[89951]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg2-ceph_lv2
Dec  1 04:14:21 np0005540741 bash[89934]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg2-ceph_lv2
Dec  1 04:14:21 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate[89951]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2
Dec  1 04:14:21 np0005540741 bash[89934]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2
Dec  1 04:14:21 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:21 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:21 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate[89951]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg2-ceph_lv2 /var/lib/ceph/osd/ceph-2/block
Dec  1 04:14:21 np0005540741 bash[89934]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg2-ceph_lv2 /var/lib/ceph/osd/ceph-2/block
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e11 do_prune osdmap full prune enabled
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e11 encode_pending skipping prime_pg_temp; mapping job did not start
Dec  1 04:14:21 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate[89951]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  1 04:14:21 np0005540741 bash[89934]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  1 04:14:21 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate[89951]: --> ceph-volume raw activate successful for osd ID: 2
Dec  1 04:14:21 np0005540741 bash[89934]: --> ceph-volume raw activate successful for osd ID: 2
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
Dec  1 04:14:21 np0005540741 systemd[1]: libpod-0a1791e946a9ed36364ab7a161adfca0762642f0ca98bc892e1b25a475673bb4.scope: Deactivated successfully.
Dec  1 04:14:21 np0005540741 podman[89934]: 2025-12-01 09:14:21.836238885 +0000 UTC m=+3.413306905 container died 0a1791e946a9ed36364ab7a161adfca0762642f0ca98bc892e1b25a475673bb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:14:21 np0005540741 systemd[1]: libpod-0a1791e946a9ed36364ab7a161adfca0762642f0ca98bc892e1b25a475673bb4.scope: Consumed 2.860s CPU time.
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e12 e12: 3 total, 1 up, 3 in
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e12 crush map has features 3314933000852226048, adjusting msgr requires
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e12: 3 total, 1 up, 3 in
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:21 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:21 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true} v 0) v1
Dec  1 04:14:21 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
Dec  1 04:14:21 np0005540741 ceph-osd[88047]: osd.0 12 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec  1 04:14:21 np0005540741 ceph-osd[88047]: osd.0 12 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Dec  1 04:14:21 np0005540741 ceph-osd[88047]: osd.0 12 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec  1 04:14:21 np0005540741 systemd[1]: var-lib-containers-storage-overlay-1a7370faf47ecc46d2f5cdc94c3e09cfe3b4487f61929d197202eb934212d00a-merged.mount: Deactivated successfully.
Dec  1 04:14:21 np0005540741 podman[89934]: 2025-12-01 09:14:21.9771448 +0000 UTC m=+3.554212820 container remove 0a1791e946a9ed36364ab7a161adfca0762642f0ca98bc892e1b25a475673bb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:14:22 np0005540741 podman[90146]: 2025-12-01 09:14:22.240154997 +0000 UTC m=+0.086434685 container create b8cc745a821772e0e0fff3e042e1907ba7c258782eed74884fe6924e85cc9329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:14:22 np0005540741 podman[90146]: 2025-12-01 09:14:22.187692118 +0000 UTC m=+0.033971836 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecd993e200d938a106ab467d743f0b19b8be9b6c706b9ccce917fda3605d2911/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecd993e200d938a106ab467d743f0b19b8be9b6c706b9ccce917fda3605d2911/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecd993e200d938a106ab467d743f0b19b8be9b6c706b9ccce917fda3605d2911/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecd993e200d938a106ab467d743f0b19b8be9b6c706b9ccce917fda3605d2911/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:22 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecd993e200d938a106ab467d743f0b19b8be9b6c706b9ccce917fda3605d2911/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:22 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:22 np0005540741 podman[90146]: 2025-12-01 09:14:22.372253414 +0000 UTC m=+0.218533102 container init b8cc745a821772e0e0fff3e042e1907ba7c258782eed74884fe6924e85cc9329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  1 04:14:22 np0005540741 podman[90146]: 2025-12-01 09:14:22.383420065 +0000 UTC m=+0.229699753 container start b8cc745a821772e0e0fff3e042e1907ba7c258782eed74884fe6924e85cc9329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True)
Dec  1 04:14:22 np0005540741 bash[90146]: b8cc745a821772e0e0fff3e042e1907ba7c258782eed74884fe6924e85cc9329
Dec  1 04:14:22 np0005540741 systemd[1]: Started Ceph osd.2 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: pidfile_write: ignore empty --pid-file
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d5919800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d5919800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d5919800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d5919800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d6751800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d6751800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d6751800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d6751800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d6751800 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d5919800 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: load: jerasure load: lrc 
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e12 do_prune osdmap full prune enabled
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e13 e13: 3 total, 1 up, 3 in
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e13: 3 total, 1 up, 3 in
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:22 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:22 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e13 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:14:22 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v39: 1 pgs: 1 unknown; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:14:22 np0005540741 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:14:23 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec  1 04:14:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:23 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:23 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluefs mount
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluefs mount shared_bdev_used = 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: RocksDB version: 7.9.2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Git sha 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Compile date 2025-05-06 23:30:25
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: DB SUMMARY
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: DB Session ID:  L5YAHCCF2C03Q9KM4AHV
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: CURRENT file:  CURRENT
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: IDENTITY file:  IDENTITY
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                         Options.error_if_exists: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.create_if_missing: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                         Options.paranoid_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                                     Options.env: 0x5595d67a3c70
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                                Options.info_log: 0x5595d59a08a0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_file_opening_threads: 16
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                              Options.statistics: (nil)
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.use_fsync: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.max_log_file_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                         Options.allow_fallocate: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.use_direct_reads: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.create_missing_column_families: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                              Options.db_log_dir: 
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                                 Options.wal_dir: db.wal
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.advise_random_on_open: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.write_buffer_manager: 0x5595d68ac460
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                            Options.rate_limiter: (nil)
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.unordered_write: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.row_cache: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                              Options.wal_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.allow_ingest_behind: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.two_write_queues: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.manual_wal_flush: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.wal_compression: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.atomic_flush: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.log_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.allow_data_in_errors: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.db_host_id: __hostname__
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.max_background_jobs: 4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.max_background_compactions: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.max_subcompactions: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.max_open_files: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.bytes_per_sync: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.max_background_flushes: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Compression algorithms supported:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: #011kZSTD supported: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: #011kXpressCompression supported: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: #011kBZip2Compression supported: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: #011kLZ4Compression supported: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: #011kZlibCompression supported: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: #011kLZ4HCCompression supported: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: #011kSnappyCompression supported: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a02c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5595d598d1f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a02c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5595d598d1f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a02c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5595d598d1f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a02c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5595d598d1f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a02c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5595d598d1f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 podman[90330]: 2025-12-01 09:14:23.466277002 +0000 UTC m=+0.104371462 container create 2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hermann, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a02c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5595d598d1f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a02c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5595d598d1f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a0240)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5595d598d090
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a0240)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5595d598d090
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a0240)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5595d598d090
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 7e93ea91-2f6f-4eb5-a54b-eec339cd63ea
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580463475148, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580463475381, "job": 1, "event": "recovery_finished"}
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: freelist init
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: freelist _read_cfg
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluefs umount
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:14:23 np0005540741 podman[90330]: 2025-12-01 09:14:23.418476125 +0000 UTC m=+0.056570595 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:23 np0005540741 systemd[1]: Started libpod-conmon-2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3.scope.
Dec  1 04:14:23 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:23 np0005540741 podman[90330]: 2025-12-01 09:14:23.665188066 +0000 UTC m=+0.303282536 container init 2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hermann, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec  1 04:14:23 np0005540741 podman[90330]: 2025-12-01 09:14:23.682811893 +0000 UTC m=+0.320906343 container start 2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hermann, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec  1 04:14:23 np0005540741 flamboyant_hermann[90541]: 167 167
Dec  1 04:14:23 np0005540741 systemd[1]: libpod-2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3.scope: Deactivated successfully.
Dec  1 04:14:23 np0005540741 podman[90330]: 2025-12-01 09:14:23.711220639 +0000 UTC m=+0.349315079 container attach 2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hermann, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:23 np0005540741 podman[90330]: 2025-12-01 09:14:23.712121236 +0000 UTC m=+0.350215686 container died 2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hermann, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True)
Dec  1 04:14:23 np0005540741 systemd[1]: var-lib-containers-storage-overlay-b0b139207923be2aa8c01b05508dc094c69253c7ceb0cf145b4525d40ef8c6d5-merged.mount: Deactivated successfully.
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluefs mount
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluefs mount shared_bdev_used = 4718592
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: RocksDB version: 7.9.2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Git sha 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Compile date 2025-05-06 23:30:25
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: DB SUMMARY
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: DB Session ID:  L5YAHCCF2C03Q9KM4AHU
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: CURRENT file:  CURRENT
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: IDENTITY file:  IDENTITY
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                         Options.error_if_exists: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.create_if_missing: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                         Options.paranoid_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                                     Options.env: 0x5595d6954380
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                                Options.info_log: 0x5595d5997280
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_file_opening_threads: 16
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                              Options.statistics: (nil)
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.use_fsync: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.max_log_file_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                         Options.allow_fallocate: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.use_direct_reads: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.create_missing_column_families: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                              Options.db_log_dir: 
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                                 Options.wal_dir: db.wal
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.advise_random_on_open: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.write_buffer_manager: 0x5595d68ac6e0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                            Options.rate_limiter: (nil)
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.unordered_write: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.row_cache: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                              Options.wal_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.allow_ingest_behind: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.two_write_queues: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.manual_wal_flush: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.wal_compression: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.atomic_flush: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.log_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.allow_data_in_errors: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.db_host_id: __hostname__
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.max_background_jobs: 4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.max_background_compactions: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.max_subcompactions: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.max_open_files: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.bytes_per_sync: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.max_background_flushes: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Compression algorithms supported:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: #011kZSTD supported: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: #011kXpressCompression supported: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: #011kBZip2Compression supported: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: #011kLZ4Compression supported: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: #011kZlibCompression supported: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: #011kLZ4HCCompression supported: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: #011kSnappyCompression supported: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5973c80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5595d598d1f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5973c80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5595d598d1f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5973c80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5595d598d1f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5973c80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5595d598d1f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5973c80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5595d598d1f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5973c80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5595d598d1f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5973c80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5595d598d1f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5997840)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5595d598d090#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5997840)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5595d598d090#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5997840)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5595d598d090#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:14:23 np0005540741 podman[90330]: 2025-12-01 09:14:23.897366363 +0000 UTC m=+0.535460813 container remove 2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hermann, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:23 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:14:23 np0005540741 systemd[1]: libpod-conmon-2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3.scope: Deactivated successfully.
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 7e93ea91-2f6f-4eb5-a54b-eec339cd63ea
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580463972890, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec  1 04:14:23 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580464007944, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580463, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "7e93ea91-2f6f-4eb5-a54b-eec339cd63ea", "db_session_id": "L5YAHCCF2C03Q9KM4AHU", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580464062807, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580464, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "7e93ea91-2f6f-4eb5-a54b-eec339cd63ea", "db_session_id": "L5YAHCCF2C03Q9KM4AHU", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580464067308, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580464, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "7e93ea91-2f6f-4eb5-a54b-eec339cd63ea", "db_session_id": "L5YAHCCF2C03Q9KM4AHU", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580464068949, "job": 1, "event": "recovery_finished"}
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec  1 04:14:24 np0005540741 podman[90747]: 2025-12-01 09:14:24.161484774 +0000 UTC m=+0.075096720 container create 3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_stonebraker, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5595d5afbc00
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: rocksdb: DB pointer 0x5595d6895a00
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.4 total, 0.4 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.4 total, 0.4 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.4 total, 0.4 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.4 total, 0.4 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 460.80 MB usag
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: _get_class not permitted to load lua
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: _get_class not permitted to load sdk
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: _get_class not permitted to load test_remote_reads
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: osd.2 0 load_pgs
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: osd.2 0 load_pgs opened 0 pgs
Dec  1 04:14:24 np0005540741 ceph-osd[90166]: osd.2 0 log_to_monitors true
Dec  1 04:14:24 np0005540741 podman[90747]: 2025-12-01 09:14:24.116513673 +0000 UTC m=+0.030125659 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:24 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2[90162]: 2025-12-01T09:14:24.176+0000 7f122005e740 -1 osd.2 0 log_to_monitors true
Dec  1 04:14:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0) v1
Dec  1 04:14:24 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec  1 04:14:24 np0005540741 systemd[1]: Started libpod-conmon-3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6.scope.
Dec  1 04:14:24 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec  1 04:14:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:24 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:24 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:24 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3153f023110b042fab684674e4433ee376f42db6be1bffa82ccc6d35957b76/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3153f023110b042fab684674e4433ee376f42db6be1bffa82ccc6d35957b76/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3153f023110b042fab684674e4433ee376f42db6be1bffa82ccc6d35957b76/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3153f023110b042fab684674e4433ee376f42db6be1bffa82ccc6d35957b76/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:24 np0005540741 podman[90747]: 2025-12-01 09:14:24.383852612 +0000 UTC m=+0.297464608 container init 3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_stonebraker, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS)
Dec  1 04:14:24 np0005540741 podman[90747]: 2025-12-01 09:14:24.395789826 +0000 UTC m=+0.309401812 container start 3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_stonebraker, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default)
Dec  1 04:14:24 np0005540741 podman[90747]: 2025-12-01 09:14:24.399725796 +0000 UTC m=+0.313337782 container attach 3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_stonebraker, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:14:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e13 do_prune osdmap full prune enabled
Dec  1 04:14:24 np0005540741 ceph-mon[75031]: from='osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec  1 04:14:24 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Dec  1 04:14:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e14 e14: 3 total, 1 up, 3 in
Dec  1 04:14:24 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e14: 3 total, 1 up, 3 in
Dec  1 04:14:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-0", "root=default"]} v 0) v1
Dec  1 04:14:24 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Dec  1 04:14:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e14 create-or-move crush item name 'osd.2' initial_weight 0.0195 at location {host=compute-0,root=default}
Dec  1 04:14:24 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v40: 1 pgs: 1 unknown; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Dec  1 04:14:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:25 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:25 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:25 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec  1 04:14:25 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec  1 04:14:25 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec  1 04:14:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:25 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e14 do_prune osdmap full prune enabled
Dec  1 04:14:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec  1 04:14:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e15 e15: 3 total, 1 up, 3 in
Dec  1 04:14:25 np0005540741 ceph-osd[90166]: osd.2 0 done with init, starting boot process
Dec  1 04:14:25 np0005540741 ceph-osd[90166]: osd.2 0 start_boot
Dec  1 04:14:25 np0005540741 ceph-osd[90166]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec  1 04:14:25 np0005540741 ceph-osd[90166]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec  1 04:14:25 np0005540741 ceph-osd[90166]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec  1 04:14:25 np0005540741 ceph-osd[90166]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec  1 04:14:25 np0005540741 ceph-osd[90166]: osd.2 0  bench count 12288000 bsize 4 KiB
Dec  1 04:14:25 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e15: 3 total, 1 up, 3 in
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]: {
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:        "osd_id": 0,
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:        "type": "bluestore"
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:    },
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:        "osd_id": 1,
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:        "type": "bluestore"
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:    },
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:        "osd_id": 2,
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:        "type": "bluestore"
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]:    }
Dec  1 04:14:25 np0005540741 great_stonebraker[90796]: }
Dec  1 04:14:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:25 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:25 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:26 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec  1 04:14:26 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:26 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:26 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:26 np0005540741 ceph-mon[75031]: from='osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Dec  1 04:14:26 np0005540741 ceph-mon[75031]: from='osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Dec  1 04:14:26 np0005540741 systemd[1]: libpod-3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6.scope: Deactivated successfully.
Dec  1 04:14:26 np0005540741 systemd[1]: libpod-3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6.scope: Consumed 1.634s CPU time.
Dec  1 04:14:26 np0005540741 podman[90747]: 2025-12-01 09:14:26.023669266 +0000 UTC m=+1.937281252 container died 3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_stonebraker, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:14:26 np0005540741 ceph-osd[89052]: osd.1 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 10.849 iops: 2777.442 elapsed_sec: 1.080
Dec  1 04:14:26 np0005540741 ceph-osd[89052]: log_channel(cluster) log [WRN] : OSD bench result of 2777.441874 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec  1 04:14:26 np0005540741 ceph-osd[89052]: osd.1 0 waiting for initial osdmap
Dec  1 04:14:26 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1[89048]: 2025-12-01T09:14:26.184+0000 7f11a1d9b640 -1 osd.1 0 waiting for initial osdmap
Dec  1 04:14:26 np0005540741 systemd[1]: var-lib-containers-storage-overlay-ca3153f023110b042fab684674e4433ee376f42db6be1bffa82ccc6d35957b76-merged.mount: Deactivated successfully.
Dec  1 04:14:26 np0005540741 ceph-osd[89052]: osd.1 15 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec  1 04:14:26 np0005540741 ceph-osd[89052]: osd.1 15 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Dec  1 04:14:26 np0005540741 ceph-osd[89052]: osd.1 15 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec  1 04:14:26 np0005540741 ceph-osd[89052]: osd.1 15 check_osdmap_features require_osd_release unknown -> reef
Dec  1 04:14:26 np0005540741 ceph-osd[89052]: osd.1 15 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec  1 04:14:26 np0005540741 ceph-osd[89052]: osd.1 15 set_numa_affinity not setting numa affinity
Dec  1 04:14:26 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1[89048]: 2025-12-01T09:14:26.240+0000 7f119d3c3640 -1 osd.1 15 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec  1 04:14:26 np0005540741 ceph-osd[89052]: osd.1 15 _collect_metadata loop4:  no unique device id for loop4: fallback method has no model nor serial
Dec  1 04:14:26 np0005540741 podman[90747]: 2025-12-01 09:14:26.266601191 +0000 UTC m=+2.180213137 container remove 3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_stonebraker, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec  1 04:14:26 np0005540741 systemd[1]: libpod-conmon-3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6.scope: Deactivated successfully.
Dec  1 04:14:26 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec  1 04:14:26 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:26 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:26 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec  1 04:14:26 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:14:26 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:26 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:14:26 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:26 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v43: 1 pgs: 1 unknown; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Dec  1 04:14:27 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec  1 04:14:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e15 do_prune osdmap full prune enabled
Dec  1 04:14:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:27 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e16 e16: 3 total, 2 up, 3 in
Dec  1 04:14:27 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857] boot
Dec  1 04:14:27 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e16: 3 total, 2 up, 3 in
Dec  1 04:14:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec  1 04:14:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec  1 04:14:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:27 np0005540741 ceph-osd[89052]: osd.1 16 state: booting -> active
Dec  1 04:14:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 16 pg[1.0( empty local-lis/les=0/0 n=0 ec=12/12 lis/c=0/0 les/c/f=0/0/0 sis=16) [1] r=0 lpr=16 pi=[12,16)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:14:27 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:27 np0005540741 ceph-mon[75031]: from='osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec  1 04:14:27 np0005540741 ceph-mon[75031]: OSD bench result of 2777.441874 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec  1 04:14:27 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:27 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:27 np0005540741 podman[91066]: 2025-12-01 09:14:27.473922443 +0000 UTC m=+0.141761562 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec  1 04:14:27 np0005540741 podman[91066]: 2025-12-01 09:14:27.642830582 +0000 UTC m=+0.310669681 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec  1 04:14:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e16 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:14:28 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec  1 04:14:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:28 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e16 do_prune osdmap full prune enabled
Dec  1 04:14:28 np0005540741 ceph-mon[75031]: osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857] boot
Dec  1 04:14:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e17 e17: 3 total, 2 up, 3 in
Dec  1 04:14:28 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e17: 3 total, 2 up, 3 in
Dec  1 04:14:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:28 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 17 pg[1.0( empty local-lis/les=16/17 n=0 ec=12/12 lis/c=0/0 les/c/f=0/0/0 sis=16) [1] r=0 lpr=16 pi=[12,16)/0 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:14:28 np0005540741 ceph-mgr[75324]: [devicehealth INFO root] creating main.db for devicehealth
Dec  1 04:14:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:14:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:14:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:28 np0005540741 ceph-mgr[75324]: [devicehealth INFO root] Check health
Dec  1 04:14:28 np0005540741 ceph-mgr[75324]: [devicehealth ERROR root] Fail to parse JSON result from daemon osd.2 ()
Dec  1 04:14:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Dec  1 04:14:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Dec  1 04:14:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon metadata", "id": "compute-0"} v 0) v1
Dec  1 04:14:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec  1 04:14:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v46: 1 pgs: 1 creating+peering; 0 B data, 853 MiB used, 39 GiB / 40 GiB avail
Dec  1 04:14:29 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec  1 04:14:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:29 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:29 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e17 do_prune osdmap full prune enabled
Dec  1 04:14:29 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:29 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:29 np0005540741 ceph-mon[75031]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Dec  1 04:14:29 np0005540741 ceph-mon[75031]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Dec  1 04:14:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e18 e18: 3 total, 2 up, 3 in
Dec  1 04:14:29 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e18: 3 total, 2 up, 3 in
Dec  1 04:14:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:29 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:29 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:30 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec  1 04:14:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:30 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:30 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:30 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e9: compute-0.psduho(active, since 77s)
Dec  1 04:14:30 np0005540741 podman[91472]: 2025-12-01 09:14:30.899959525 +0000 UTC m=+0.056476782 container create 2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_raman, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec  1 04:14:30 np0005540741 podman[91472]: 2025-12-01 09:14:30.874434597 +0000 UTC m=+0.030951854 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:30 np0005540741 systemd[1]: Started libpod-conmon-2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6.scope.
Dec  1 04:14:31 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v48: 1 pgs: 1 creating+peering; 0 B data, 453 MiB used, 40 GiB / 40 GiB avail
Dec  1 04:14:31 np0005540741 podman[91472]: 2025-12-01 09:14:31.10484088 +0000 UTC m=+0.261358147 container init 2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_raman, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:14:31 np0005540741 podman[91472]: 2025-12-01 09:14:31.115453204 +0000 UTC m=+0.271970441 container start 2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_raman, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:31 np0005540741 exciting_raman[91489]: 167 167
Dec  1 04:14:31 np0005540741 systemd[1]: libpod-2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6.scope: Deactivated successfully.
Dec  1 04:14:31 np0005540741 podman[91472]: 2025-12-01 09:14:31.151041399 +0000 UTC m=+0.307558656 container attach 2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_raman, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:14:31 np0005540741 podman[91472]: 2025-12-01 09:14:31.152978648 +0000 UTC m=+0.309495905 container died 2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_raman, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:14:31 np0005540741 systemd[1]: var-lib-containers-storage-overlay-9b646604e537e42241c5e222822e81cc09c9b2d587e58283103bef290df90daa-merged.mount: Deactivated successfully.
Dec  1 04:14:31 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec  1 04:14:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:31 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:31 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:31 np0005540741 podman[91472]: 2025-12-01 09:14:31.288678323 +0000 UTC m=+0.445195560 container remove 2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_raman, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:14:31 np0005540741 systemd[1]: libpod-conmon-2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6.scope: Deactivated successfully.
Dec  1 04:14:31 np0005540741 podman[91513]: 2025-12-01 09:14:31.603464738 +0000 UTC m=+0.048835698 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:31 np0005540741 podman[91513]: 2025-12-01 09:14:31.718920498 +0000 UTC m=+0.164291428 container create ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_edison, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec  1 04:14:31 np0005540741 systemd[1]: Started libpod-conmon-ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620.scope.
Dec  1 04:14:31 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:31 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd7781993f4251c8648ab23bae0fc2a803ce88acc85341d052ed407a11cbb377/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:31 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd7781993f4251c8648ab23bae0fc2a803ce88acc85341d052ed407a11cbb377/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:31 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd7781993f4251c8648ab23bae0fc2a803ce88acc85341d052ed407a11cbb377/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:31 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd7781993f4251c8648ab23bae0fc2a803ce88acc85341d052ed407a11cbb377/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:31 np0005540741 podman[91513]: 2025-12-01 09:14:31.833000455 +0000 UTC m=+0.278371405 container init ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_edison, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 04:14:31 np0005540741 podman[91513]: 2025-12-01 09:14:31.840420031 +0000 UTC m=+0.285790961 container start ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_edison, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec  1 04:14:31 np0005540741 podman[91513]: 2025-12-01 09:14:31.858468532 +0000 UTC m=+0.303839552 container attach ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_edison, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec  1 04:14:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:32 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:32 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec  1 04:14:32 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e18 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:14:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v49: 1 pgs: 1 active+clean; 449 KiB data, 453 MiB used, 40 GiB / 40 GiB avail
Dec  1 04:14:33 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec  1 04:14:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:33 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:33 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:33 np0005540741 frosty_edison[91530]: [
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:    {
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:        "available": false,
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:        "ceph_device": false,
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:        "device_id": "QEMU_DVD-ROM_QM00001",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:        "lsm_data": {},
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:        "lvs": [],
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:        "path": "/dev/sr0",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:        "rejected_reasons": [
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "Has a FileSystem",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "Insufficient space (<5GB)"
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:        ],
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:        "sys_api": {
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "actuators": null,
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "device_nodes": "sr0",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "devname": "sr0",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "human_readable_size": "482.00 KB",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "id_bus": "ata",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "model": "QEMU DVD-ROM",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "nr_requests": "2",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "parent": "/dev/sr0",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "partitions": {},
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "path": "/dev/sr0",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "removable": "1",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "rev": "2.5+",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "ro": "0",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "rotational": "1",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "sas_address": "",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "sas_device_handle": "",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "scheduler_mode": "mq-deadline",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "sectors": 0,
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "sectorsize": "2048",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "size": 493568.0,
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "support_discard": "2048",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "type": "disk",
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:            "vendor": "QEMU"
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:        }
Dec  1 04:14:33 np0005540741 frosty_edison[91530]:    }
Dec  1 04:14:33 np0005540741 frosty_edison[91530]: ]
Dec  1 04:14:33 np0005540741 ceph-osd[90166]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 16.769 iops: 4292.904 elapsed_sec: 0.699
Dec  1 04:14:33 np0005540741 ceph-osd[90166]: log_channel(cluster) log [WRN] : OSD bench result of 4292.903722 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec  1 04:14:33 np0005540741 ceph-osd[90166]: osd.2 0 waiting for initial osdmap
Dec  1 04:14:33 np0005540741 systemd[1]: libpod-ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620.scope: Deactivated successfully.
Dec  1 04:14:33 np0005540741 systemd[1]: libpod-ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620.scope: Consumed 2.074s CPU time.
Dec  1 04:14:33 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2[90162]: 2025-12-01T09:14:33.870+0000 7f121c7f5640 -1 osd.2 0 waiting for initial osdmap
Dec  1 04:14:33 np0005540741 podman[91513]: 2025-12-01 09:14:33.874407081 +0000 UTC m=+2.319778011 container died ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_edison, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Dec  1 04:14:33 np0005540741 ceph-osd[90166]: osd.2 18 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec  1 04:14:33 np0005540741 ceph-osd[90166]: osd.2 18 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Dec  1 04:14:33 np0005540741 ceph-osd[90166]: osd.2 18 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec  1 04:14:33 np0005540741 ceph-osd[90166]: osd.2 18 check_osdmap_features require_osd_release unknown -> reef
Dec  1 04:14:33 np0005540741 ceph-osd[90166]: osd.2 18 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec  1 04:14:33 np0005540741 ceph-osd[90166]: osd.2 18 set_numa_affinity not setting numa affinity
Dec  1 04:14:33 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2[90162]: 2025-12-01T09:14:33.902+0000 7f1217606640 -1 osd.2 18 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec  1 04:14:33 np0005540741 ceph-osd[90166]: osd.2 18 _collect_metadata loop5:  no unique device id for loop5: fallback method has no model nor serial
Dec  1 04:14:33 np0005540741 systemd[1]: var-lib-containers-storage-overlay-dd7781993f4251c8648ab23bae0fc2a803ce88acc85341d052ed407a11cbb377-merged.mount: Deactivated successfully.
Dec  1 04:14:33 np0005540741 podman[91513]: 2025-12-01 09:14:33.958825645 +0000 UTC m=+2.404196575 container remove ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_edison, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  1 04:14:33 np0005540741 systemd[1]: libpod-conmon-ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620.scope: Deactivated successfully.
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0) v1
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0) v1
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0) v1
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Dec  1 04:14:34 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Adjusting osd_memory_target on compute-0 to 43690k
Dec  1 04:14:34 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on compute-0 to 43690k
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) v1
Dec  1 04:14:34 np0005540741 ceph-mgr[75324]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on compute-0 to 44739242: error parsing value: Value '44739242' is below minimum 939524096
Dec  1 04:14:34 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on compute-0 to 44739242: error parsing value: Value '44739242' is below minimum 939524096
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:34 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 45e0131a-7f24-4bba-8ffd-dc6331f7ff73 does not exist
Dec  1 04:14:34 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 8365c79d-8bb7-4c64-a07e-6d2e276939d2 does not exist
Dec  1 04:14:34 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev fc0e2293-dca2-418f-be53-e200727213c6 does not exist
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:14:34 np0005540741 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:34 np0005540741 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec  1 04:14:34 np0005540741 podman[93313]: 2025-12-01 09:14:34.781825591 +0000 UTC m=+0.042454595 container create 48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banzai, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e18 do_prune osdmap full prune enabled
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: OSD bench result of 4292.903722 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:14:34 np0005540741 systemd[1]: Started libpod-conmon-48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5.scope.
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e19 e19: 3 total, 3 up, 3 in
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466] boot
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e19: 3 total, 3 up, 3 in
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec  1 04:14:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec  1 04:14:34 np0005540741 ceph-osd[90166]: osd.2 19 state: booting -> active
Dec  1 04:14:34 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:34 np0005540741 podman[93313]: 2025-12-01 09:14:34.761043798 +0000 UTC m=+0.021672792 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:34 np0005540741 podman[93313]: 2025-12-01 09:14:34.86017178 +0000 UTC m=+0.120800764 container init 48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banzai, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Dec  1 04:14:34 np0005540741 podman[93313]: 2025-12-01 09:14:34.872613259 +0000 UTC m=+0.133242223 container start 48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banzai, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef)
Dec  1 04:14:34 np0005540741 podman[93313]: 2025-12-01 09:14:34.876664142 +0000 UTC m=+0.137293106 container attach 48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banzai, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec  1 04:14:34 np0005540741 lucid_banzai[93330]: 167 167
Dec  1 04:14:34 np0005540741 systemd[1]: libpod-48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5.scope: Deactivated successfully.
Dec  1 04:14:34 np0005540741 podman[93313]: 2025-12-01 09:14:34.879715735 +0000 UTC m=+0.140344699 container died 48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banzai, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Dec  1 04:14:34 np0005540741 systemd[1]: var-lib-containers-storage-overlay-26b7bf6ff0c6be4409adb10fb6ebf9fd57a8c05e6ecd10a5bb73d02c3a8501cc-merged.mount: Deactivated successfully.
Dec  1 04:14:34 np0005540741 podman[93313]: 2025-12-01 09:14:34.913523956 +0000 UTC m=+0.174152920 container remove 48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banzai, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:34 np0005540741 systemd[1]: libpod-conmon-48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5.scope: Deactivated successfully.
Dec  1 04:14:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v51: 1 pgs: 1 active+clean; 449 KiB data, 453 MiB used, 40 GiB / 40 GiB avail
Dec  1 04:14:35 np0005540741 podman[93355]: 2025-12-01 09:14:35.064004983 +0000 UTC m=+0.033524753 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:35 np0005540741 podman[93355]: 2025-12-01 09:14:35.429074079 +0000 UTC m=+0.398593859 container create 345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_zhukovsky, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef)
Dec  1 04:14:35 np0005540741 systemd[1]: Started libpod-conmon-345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a.scope.
Dec  1 04:14:35 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:35 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ec28d90325cfbfc44affa0ec33d64c9e2854cbcdf2ea17b7f776acbc5be2361/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:35 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ec28d90325cfbfc44affa0ec33d64c9e2854cbcdf2ea17b7f776acbc5be2361/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:35 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ec28d90325cfbfc44affa0ec33d64c9e2854cbcdf2ea17b7f776acbc5be2361/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:35 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ec28d90325cfbfc44affa0ec33d64c9e2854cbcdf2ea17b7f776acbc5be2361/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:35 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ec28d90325cfbfc44affa0ec33d64c9e2854cbcdf2ea17b7f776acbc5be2361/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:35 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e19 do_prune osdmap full prune enabled
Dec  1 04:14:35 np0005540741 ceph-mon[75031]: Adjusting osd_memory_target on compute-0 to 43690k
Dec  1 04:14:35 np0005540741 ceph-mon[75031]: Unable to set osd_memory_target on compute-0 to 44739242: error parsing value: Value '44739242' is below minimum 939524096
Dec  1 04:14:35 np0005540741 ceph-mon[75031]: osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466] boot
Dec  1 04:14:35 np0005540741 podman[93355]: 2025-12-01 09:14:35.898599092 +0000 UTC m=+0.868118942 container init 345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_zhukovsky, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec  1 04:14:35 np0005540741 podman[93355]: 2025-12-01 09:14:35.909879836 +0000 UTC m=+0.879399616 container start 345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_zhukovsky, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:14:36 np0005540741 podman[93355]: 2025-12-01 09:14:36.16820844 +0000 UTC m=+1.137728210 container attach 345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_zhukovsky, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507)
Dec  1 04:14:36 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e20 e20: 3 total, 3 up, 3 in
Dec  1 04:14:36 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e20: 3 total, 3 up, 3 in
Dec  1 04:14:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v53: 1 pgs: 1 active+clean; 449 KiB data, 880 MiB used, 59 GiB / 60 GiB avail
Dec  1 04:14:37 np0005540741 festive_zhukovsky[93372]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:14:37 np0005540741 festive_zhukovsky[93372]: --> relative data size: 1.0
Dec  1 04:14:37 np0005540741 festive_zhukovsky[93372]: --> All data devices are unavailable
Dec  1 04:14:37 np0005540741 systemd[1]: libpod-345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a.scope: Deactivated successfully.
Dec  1 04:14:37 np0005540741 systemd[1]: libpod-345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a.scope: Consumed 1.237s CPU time.
Dec  1 04:14:37 np0005540741 podman[93355]: 2025-12-01 09:14:37.197403203 +0000 UTC m=+2.166922973 container died 345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_zhukovsky, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec  1 04:14:37 np0005540741 systemd[1]: var-lib-containers-storage-overlay-8ec28d90325cfbfc44affa0ec33d64c9e2854cbcdf2ea17b7f776acbc5be2361-merged.mount: Deactivated successfully.
Dec  1 04:14:37 np0005540741 podman[93355]: 2025-12-01 09:14:37.260549287 +0000 UTC m=+2.230069057 container remove 345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_zhukovsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  1 04:14:37 np0005540741 systemd[1]: libpod-conmon-345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a.scope: Deactivated successfully.
Dec  1 04:14:37 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e20 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:14:37 np0005540741 podman[93552]: 2025-12-01 09:14:37.97481632 +0000 UTC m=+0.041046822 container create 5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_shaw, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec  1 04:14:38 np0005540741 systemd[1]: Started libpod-conmon-5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286.scope.
Dec  1 04:14:38 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:38 np0005540741 podman[93552]: 2025-12-01 09:14:37.957151181 +0000 UTC m=+0.023381703 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:38 np0005540741 podman[93552]: 2025-12-01 09:14:38.079890883 +0000 UTC m=+0.146121405 container init 5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_shaw, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:14:38 np0005540741 podman[93552]: 2025-12-01 09:14:38.092837258 +0000 UTC m=+0.159067770 container start 5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_shaw, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec  1 04:14:38 np0005540741 podman[93552]: 2025-12-01 09:14:38.097164329 +0000 UTC m=+0.163394871 container attach 5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_shaw, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec  1 04:14:38 np0005540741 systemd[1]: libpod-5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286.scope: Deactivated successfully.
Dec  1 04:14:38 np0005540741 strange_shaw[93569]: 167 167
Dec  1 04:14:38 np0005540741 podman[93552]: 2025-12-01 09:14:38.101829842 +0000 UTC m=+0.168060374 container died 5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_shaw, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:38 np0005540741 conmon[93569]: conmon 5b298c80e1b4c7f21471 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286.scope/container/memory.events
Dec  1 04:14:38 np0005540741 systemd[1]: var-lib-containers-storage-overlay-64098f3859646499d6812202bdf0b27619c95358a26a447d3b385b5f776aff38-merged.mount: Deactivated successfully.
Dec  1 04:14:38 np0005540741 podman[93552]: 2025-12-01 09:14:38.167430331 +0000 UTC m=+0.233660833 container remove 5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_shaw, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:14:38 np0005540741 systemd[1]: libpod-conmon-5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286.scope: Deactivated successfully.
Dec  1 04:14:38 np0005540741 podman[93591]: 2025-12-01 09:14:38.366994165 +0000 UTC m=+0.058954328 container create bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_wu, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:14:38 np0005540741 systemd[1]: Started libpod-conmon-bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65.scope.
Dec  1 04:14:38 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11e2c741073363099782bf8de6d1feed77cb185a8914f94075e5f9cbc5a38d09/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11e2c741073363099782bf8de6d1feed77cb185a8914f94075e5f9cbc5a38d09/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11e2c741073363099782bf8de6d1feed77cb185a8914f94075e5f9cbc5a38d09/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11e2c741073363099782bf8de6d1feed77cb185a8914f94075e5f9cbc5a38d09/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:38 np0005540741 podman[93591]: 2025-12-01 09:14:38.349192982 +0000 UTC m=+0.041153155 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:38 np0005540741 podman[93591]: 2025-12-01 09:14:38.470920943 +0000 UTC m=+0.162881176 container init bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_wu, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:38 np0005540741 podman[93591]: 2025-12-01 09:14:38.479400491 +0000 UTC m=+0.171360634 container start bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_wu, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:14:38 np0005540741 podman[93591]: 2025-12-01 09:14:38.483435454 +0000 UTC m=+0.175395617 container attach bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_wu, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec  1 04:14:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v54: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:14:39 np0005540741 loving_wu[93608]: {
Dec  1 04:14:39 np0005540741 loving_wu[93608]:    "0": [
Dec  1 04:14:39 np0005540741 loving_wu[93608]:        {
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "devices": [
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "/dev/loop3"
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            ],
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "lv_name": "ceph_lv0",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "lv_size": "21470642176",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "name": "ceph_lv0",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "tags": {
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.cluster_name": "ceph",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.crush_device_class": "",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.encrypted": "0",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.osd_id": "0",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.type": "block",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.vdo": "0"
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            },
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "type": "block",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "vg_name": "ceph_vg0"
Dec  1 04:14:39 np0005540741 loving_wu[93608]:        }
Dec  1 04:14:39 np0005540741 loving_wu[93608]:    ],
Dec  1 04:14:39 np0005540741 loving_wu[93608]:    "1": [
Dec  1 04:14:39 np0005540741 loving_wu[93608]:        {
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "devices": [
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "/dev/loop4"
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            ],
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "lv_name": "ceph_lv1",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "lv_size": "21470642176",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "name": "ceph_lv1",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "tags": {
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.cluster_name": "ceph",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.crush_device_class": "",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.encrypted": "0",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.osd_id": "1",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.type": "block",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.vdo": "0"
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            },
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "type": "block",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "vg_name": "ceph_vg1"
Dec  1 04:14:39 np0005540741 loving_wu[93608]:        }
Dec  1 04:14:39 np0005540741 loving_wu[93608]:    ],
Dec  1 04:14:39 np0005540741 loving_wu[93608]:    "2": [
Dec  1 04:14:39 np0005540741 loving_wu[93608]:        {
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "devices": [
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "/dev/loop5"
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            ],
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "lv_name": "ceph_lv2",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "lv_size": "21470642176",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "name": "ceph_lv2",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "tags": {
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.cluster_name": "ceph",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.crush_device_class": "",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.encrypted": "0",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.osd_id": "2",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.type": "block",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:                "ceph.vdo": "0"
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            },
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "type": "block",
Dec  1 04:14:39 np0005540741 loving_wu[93608]:            "vg_name": "ceph_vg2"
Dec  1 04:14:39 np0005540741 loving_wu[93608]:        }
Dec  1 04:14:39 np0005540741 loving_wu[93608]:    ]
Dec  1 04:14:39 np0005540741 loving_wu[93608]: }
Dec  1 04:14:39 np0005540741 systemd[1]: libpod-bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65.scope: Deactivated successfully.
Dec  1 04:14:39 np0005540741 podman[93591]: 2025-12-01 09:14:39.301371256 +0000 UTC m=+0.993331449 container died bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_wu, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec  1 04:14:39 np0005540741 systemd[1]: var-lib-containers-storage-overlay-11e2c741073363099782bf8de6d1feed77cb185a8914f94075e5f9cbc5a38d09-merged.mount: Deactivated successfully.
Dec  1 04:14:39 np0005540741 podman[93591]: 2025-12-01 09:14:39.362473048 +0000 UTC m=+1.054433191 container remove bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_wu, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:14:39 np0005540741 systemd[1]: libpod-conmon-bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65.scope: Deactivated successfully.
Dec  1 04:14:39 np0005540741 podman[93771]: 2025-12-01 09:14:39.997079012 +0000 UTC m=+0.045468917 container create 3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_lichterman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Dec  1 04:14:40 np0005540741 systemd[1]: Started libpod-conmon-3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45.scope.
Dec  1 04:14:40 np0005540741 podman[93771]: 2025-12-01 09:14:39.975576576 +0000 UTC m=+0.023966501 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:40 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:40 np0005540741 podman[93771]: 2025-12-01 09:14:40.150660333 +0000 UTC m=+0.199050258 container init 3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_lichterman, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:14:40 np0005540741 podman[93771]: 2025-12-01 09:14:40.157911974 +0000 UTC m=+0.206301879 container start 3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_lichterman, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:14:40 np0005540741 modest_lichterman[93787]: 167 167
Dec  1 04:14:40 np0005540741 systemd[1]: libpod-3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45.scope: Deactivated successfully.
Dec  1 04:14:40 np0005540741 podman[93771]: 2025-12-01 09:14:40.181278827 +0000 UTC m=+0.229668752 container attach 3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_lichterman, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Dec  1 04:14:40 np0005540741 podman[93771]: 2025-12-01 09:14:40.182754091 +0000 UTC m=+0.231143996 container died 3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_lichterman, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec  1 04:14:40 np0005540741 systemd[1]: var-lib-containers-storage-overlay-dc9d557c3e7ab3a9c1a53ad24f861f03eafa01e3cb0784eb63335d486fe5ee15-merged.mount: Deactivated successfully.
Dec  1 04:14:40 np0005540741 podman[93771]: 2025-12-01 09:14:40.235578352 +0000 UTC m=+0.283968267 container remove 3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_lichterman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec  1 04:14:40 np0005540741 systemd[1]: libpod-conmon-3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45.scope: Deactivated successfully.
Dec  1 04:14:40 np0005540741 podman[93810]: 2025-12-01 09:14:40.407275345 +0000 UTC m=+0.043771855 container create 7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_mirzakhani, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:40 np0005540741 systemd[1]: Started libpod-conmon-7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609.scope.
Dec  1 04:14:40 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:40 np0005540741 podman[93810]: 2025-12-01 09:14:40.386966626 +0000 UTC m=+0.023463166 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:40 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d49931cd36ff1158e6f37e52c4a9f1cb91440fd5272e29551561448793d19eba/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:40 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d49931cd36ff1158e6f37e52c4a9f1cb91440fd5272e29551561448793d19eba/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:40 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d49931cd36ff1158e6f37e52c4a9f1cb91440fd5272e29551561448793d19eba/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:40 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d49931cd36ff1158e6f37e52c4a9f1cb91440fd5272e29551561448793d19eba/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:40 np0005540741 podman[93810]: 2025-12-01 09:14:40.509434399 +0000 UTC m=+0.145930949 container init 7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_mirzakhani, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:40 np0005540741 podman[93810]: 2025-12-01 09:14:40.518144725 +0000 UTC m=+0.154641245 container start 7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_mirzakhani, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:14:40 np0005540741 podman[93810]: 2025-12-01 09:14:40.521636511 +0000 UTC m=+0.158133031 container attach 7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_mirzakhani, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec  1 04:14:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v55: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]: {
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:        "osd_id": 0,
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:        "type": "bluestore"
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:    },
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:        "osd_id": 1,
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:        "type": "bluestore"
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:    },
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:        "osd_id": 2,
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:        "type": "bluestore"
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]:    }
Dec  1 04:14:41 np0005540741 sleepy_mirzakhani[93826]: }
Dec  1 04:14:41 np0005540741 systemd[1]: libpod-7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609.scope: Deactivated successfully.
Dec  1 04:14:41 np0005540741 systemd[1]: libpod-7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609.scope: Consumed 1.106s CPU time.
Dec  1 04:14:41 np0005540741 podman[93859]: 2025-12-01 09:14:41.653219773 +0000 UTC m=+0.027654264 container died 7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_mirzakhani, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:14:41 np0005540741 systemd[1]: var-lib-containers-storage-overlay-d49931cd36ff1158e6f37e52c4a9f1cb91440fd5272e29551561448793d19eba-merged.mount: Deactivated successfully.
Dec  1 04:14:41 np0005540741 podman[93859]: 2025-12-01 09:14:41.721759342 +0000 UTC m=+0.096193823 container remove 7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_mirzakhani, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec  1 04:14:41 np0005540741 systemd[1]: libpod-conmon-7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609.scope: Deactivated successfully.
Dec  1 04:14:41 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:14:41 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:41 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:14:41 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:41 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/alertmanager/web_user}] v 0) v1
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/alertmanager/web_password}] v 0) v1
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/prometheus/web_user}] v 0) v1
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/prometheus/web_password}] v 0) v1
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:42 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Reconfiguring mon.compute-0 (unknown last config time)...
Dec  1 04:14:42 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Reconfiguring mon.compute-0 (unknown last config time)...
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) v1
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) v1
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:14:42 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.compute-0 on compute-0
Dec  1 04:14:42 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.compute-0 on compute-0
Dec  1 04:14:42 np0005540741 podman[94038]: 2025-12-01 09:14:42.556132885 +0000 UTC m=+0.056305868 container create 56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_archimedes, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec  1 04:14:42 np0005540741 systemd[1]: Started libpod-conmon-56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2.scope.
Dec  1 04:14:42 np0005540741 podman[94038]: 2025-12-01 09:14:42.526108089 +0000 UTC m=+0.026281072 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:42 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:42 np0005540741 podman[94038]: 2025-12-01 09:14:42.651742429 +0000 UTC m=+0.151915412 container init 56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_archimedes, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec  1 04:14:42 np0005540741 podman[94038]: 2025-12-01 09:14:42.659958819 +0000 UTC m=+0.160131792 container start 56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_archimedes, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:42 np0005540741 podman[94038]: 2025-12-01 09:14:42.663767706 +0000 UTC m=+0.163940679 container attach 56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_archimedes, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True)
Dec  1 04:14:42 np0005540741 nervous_archimedes[94054]: 167 167
Dec  1 04:14:42 np0005540741 systemd[1]: libpod-56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2.scope: Deactivated successfully.
Dec  1 04:14:42 np0005540741 podman[94038]: 2025-12-01 09:14:42.666730666 +0000 UTC m=+0.166903639 container died 56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_archimedes, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:14:42 np0005540741 systemd[1]: var-lib-containers-storage-overlay-f0fbe01d42b978c2b801d704e3e62cdc60be826547d475f066d6a794afbb7220-merged.mount: Deactivated successfully.
Dec  1 04:14:42 np0005540741 podman[94038]: 2025-12-01 09:14:42.70556851 +0000 UTC m=+0.205741483 container remove 56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_archimedes, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:42 np0005540741 systemd[1]: libpod-conmon-56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2.scope: Deactivated successfully.
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:42 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Reconfiguring mgr.compute-0.psduho (unknown last config time)...
Dec  1 04:14:42 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Reconfiguring mgr.compute-0.psduho (unknown last config time)...
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.compute-0.psduho", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) v1
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.psduho", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:14:42 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.compute-0.psduho on compute-0
Dec  1 04:14:42 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.compute-0.psduho on compute-0
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.psduho", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  1 04:14:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e20 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:14:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:14:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:14:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:14:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:14:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:14:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:14:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:14:43 np0005540741 podman[94190]: 2025-12-01 09:14:43.391043175 +0000 UTC m=+0.042971021 container create 78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_bhabha, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:43 np0005540741 systemd[1]: Started libpod-conmon-78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306.scope.
Dec  1 04:14:43 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:43 np0005540741 podman[94190]: 2025-12-01 09:14:43.467195726 +0000 UTC m=+0.119123572 container init 78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_bhabha, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS)
Dec  1 04:14:43 np0005540741 podman[94190]: 2025-12-01 09:14:43.370640233 +0000 UTC m=+0.022568079 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:43 np0005540741 podman[94190]: 2025-12-01 09:14:43.472641642 +0000 UTC m=+0.124569468 container start 78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_bhabha, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:43 np0005540741 podman[94190]: 2025-12-01 09:14:43.476168819 +0000 UTC m=+0.128096695 container attach 78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_bhabha, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec  1 04:14:43 np0005540741 great_bhabha[94206]: 167 167
Dec  1 04:14:43 np0005540741 systemd[1]: libpod-78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306.scope: Deactivated successfully.
Dec  1 04:14:43 np0005540741 podman[94190]: 2025-12-01 09:14:43.477990245 +0000 UTC m=+0.129918061 container died 78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_bhabha, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef)
Dec  1 04:14:43 np0005540741 systemd[1]: var-lib-containers-storage-overlay-15a56f30a33de5aebfb742a67a74e3d246371ca9202990041651596b4aaa7310-merged.mount: Deactivated successfully.
Dec  1 04:14:43 np0005540741 podman[94190]: 2025-12-01 09:14:43.51359117 +0000 UTC m=+0.165518996 container remove 78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_bhabha, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:14:43 np0005540741 systemd[1]: libpod-conmon-78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306.scope: Deactivated successfully.
Dec  1 04:14:43 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:14:43 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:43 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:14:43 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:43 np0005540741 ceph-mon[75031]: Reconfiguring mon.compute-0 (unknown last config time)...
Dec  1 04:14:43 np0005540741 ceph-mon[75031]: Reconfiguring daemon mon.compute-0 on compute-0
Dec  1 04:14:43 np0005540741 ceph-mon[75031]: Reconfiguring mgr.compute-0.psduho (unknown last config time)...
Dec  1 04:14:43 np0005540741 ceph-mon[75031]: Reconfiguring daemon mgr.compute-0.psduho on compute-0
Dec  1 04:14:43 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:43 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:44 np0005540741 podman[94397]: 2025-12-01 09:14:44.404056923 +0000 UTC m=+0.058150093 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:14:44 np0005540741 podman[94397]: 2025-12-01 09:14:44.511709785 +0000 UTC m=+0.165802925 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:14:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:14:45 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:45 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:14:45 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:45 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:14:45 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:14:45 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:14:45 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:14:45 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:14:45 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:14:45 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 6aaf94fd-d6b5-4da8-a92a-420f856e4c4a does not exist
Dec  1 04:14:45 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 12544f78-4fd1-42ca-a06c-5f8eefcb686f does not exist
Dec  1 04:14:45 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 5b1226f3-a7e3-4d6f-b4fa-3ba594acd496 does not exist
Dec  1 04:14:45 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:14:45 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:14:45 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:14:45 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:14:45 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:14:45 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:14:45 np0005540741 podman[94659]: 2025-12-01 09:14:45.628727314 +0000 UTC m=+0.054011908 container create 73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_hofstadter, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Dec  1 04:14:45 np0005540741 systemd[1]: Started libpod-conmon-73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c.scope.
Dec  1 04:14:45 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:45 np0005540741 podman[94659]: 2025-12-01 09:14:45.602610368 +0000 UTC m=+0.027895012 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:45 np0005540741 podman[94659]: 2025-12-01 09:14:45.709842936 +0000 UTC m=+0.135127600 container init 73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_hofstadter, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec  1 04:14:45 np0005540741 podman[94659]: 2025-12-01 09:14:45.720145671 +0000 UTC m=+0.145430255 container start 73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_hofstadter, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:14:45 np0005540741 agitated_hofstadter[94675]: 167 167
Dec  1 04:14:45 np0005540741 systemd[1]: libpod-73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c.scope: Deactivated successfully.
Dec  1 04:14:45 np0005540741 podman[94659]: 2025-12-01 09:14:45.724224335 +0000 UTC m=+0.149508949 container attach 73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_hofstadter, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec  1 04:14:45 np0005540741 podman[94659]: 2025-12-01 09:14:45.724669718 +0000 UTC m=+0.149954302 container died 73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_hofstadter, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:14:45 np0005540741 systemd[1]: var-lib-containers-storage-overlay-45270ed9d4e0f009d5cf4fad28871f7dad7f625def583b3653bf8e324096edf9-merged.mount: Deactivated successfully.
Dec  1 04:14:45 np0005540741 podman[94659]: 2025-12-01 09:14:45.764566795 +0000 UTC m=+0.189851399 container remove 73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_hofstadter, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:14:45 np0005540741 systemd[1]: libpod-conmon-73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c.scope: Deactivated successfully.
Dec  1 04:14:45 np0005540741 podman[94698]: 2025-12-01 09:14:45.994474811 +0000 UTC m=+0.115209971 container create c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_curran, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec  1 04:14:46 np0005540741 podman[94698]: 2025-12-01 09:14:45.905067707 +0000 UTC m=+0.025802857 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:46 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:46 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:46 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:14:46 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:46 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:14:46 np0005540741 systemd[1]: Started libpod-conmon-c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501.scope.
Dec  1 04:14:46 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:46 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20e61ba8de1fbeb5168aa88cff2ed4ee0816cb7774fc6eb6dcfa8fe3cc6e8a38/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:46 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20e61ba8de1fbeb5168aa88cff2ed4ee0816cb7774fc6eb6dcfa8fe3cc6e8a38/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:46 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20e61ba8de1fbeb5168aa88cff2ed4ee0816cb7774fc6eb6dcfa8fe3cc6e8a38/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:46 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20e61ba8de1fbeb5168aa88cff2ed4ee0816cb7774fc6eb6dcfa8fe3cc6e8a38/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:46 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20e61ba8de1fbeb5168aa88cff2ed4ee0816cb7774fc6eb6dcfa8fe3cc6e8a38/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:46 np0005540741 podman[94698]: 2025-12-01 09:14:46.189502247 +0000 UTC m=+0.310237427 container init c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_curran, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default)
Dec  1 04:14:46 np0005540741 podman[94698]: 2025-12-01 09:14:46.198054987 +0000 UTC m=+0.318790127 container start c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_curran, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:46 np0005540741 podman[94698]: 2025-12-01 09:14:46.201548904 +0000 UTC m=+0.322284124 container attach c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_curran, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:14:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v58: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:14:47 np0005540741 suspicious_curran[94714]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:14:47 np0005540741 suspicious_curran[94714]: --> relative data size: 1.0
Dec  1 04:14:47 np0005540741 suspicious_curran[94714]: --> All data devices are unavailable
Dec  1 04:14:47 np0005540741 systemd[1]: libpod-c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501.scope: Deactivated successfully.
Dec  1 04:14:47 np0005540741 systemd[1]: libpod-c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501.scope: Consumed 1.227s CPU time.
Dec  1 04:14:47 np0005540741 podman[94698]: 2025-12-01 09:14:47.46315382 +0000 UTC m=+1.583888960 container died c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_curran, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:47 np0005540741 systemd[1]: var-lib-containers-storage-overlay-20e61ba8de1fbeb5168aa88cff2ed4ee0816cb7774fc6eb6dcfa8fe3cc6e8a38-merged.mount: Deactivated successfully.
Dec  1 04:14:47 np0005540741 podman[94698]: 2025-12-01 09:14:47.587583353 +0000 UTC m=+1.708318483 container remove c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_curran, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Dec  1 04:14:47 np0005540741 systemd[1]: libpod-conmon-c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501.scope: Deactivated successfully.
Dec  1 04:14:47 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e20 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:14:48 np0005540741 podman[94898]: 2025-12-01 09:14:48.277256506 +0000 UTC m=+0.055764011 container create b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_lichterman, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True)
Dec  1 04:14:48 np0005540741 systemd[1]: Started libpod-conmon-b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32.scope.
Dec  1 04:14:48 np0005540741 podman[94898]: 2025-12-01 09:14:48.254551504 +0000 UTC m=+0.033059059 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:48 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:48 np0005540741 podman[94898]: 2025-12-01 09:14:48.36891749 +0000 UTC m=+0.147425035 container init b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_lichterman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:14:48 np0005540741 podman[94898]: 2025-12-01 09:14:48.37746418 +0000 UTC m=+0.155971685 container start b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_lichterman, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:48 np0005540741 podman[94898]: 2025-12-01 09:14:48.381617977 +0000 UTC m=+0.160125482 container attach b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_lichterman, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:14:48 np0005540741 unruffled_lichterman[94914]: 167 167
Dec  1 04:14:48 np0005540741 systemd[1]: libpod-b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32.scope: Deactivated successfully.
Dec  1 04:14:48 np0005540741 conmon[94914]: conmon b105126fc98498345ed8 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32.scope/container/memory.events
Dec  1 04:14:48 np0005540741 podman[94898]: 2025-12-01 09:14:48.388135656 +0000 UTC m=+0.166643171 container died b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_lichterman, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:14:48 np0005540741 systemd[1]: var-lib-containers-storage-overlay-d0d9fe6c1b686670527a99430fdefb67465b531d692cdf50f4dcac28916066e3-merged.mount: Deactivated successfully.
Dec  1 04:14:48 np0005540741 podman[94898]: 2025-12-01 09:14:48.425262637 +0000 UTC m=+0.203770142 container remove b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_lichterman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec  1 04:14:48 np0005540741 systemd[1]: libpod-conmon-b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32.scope: Deactivated successfully.
Dec  1 04:14:48 np0005540741 podman[94938]: 2025-12-01 09:14:48.583868712 +0000 UTC m=+0.037430902 container create 5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_hermann, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:14:48 np0005540741 systemd[1]: Started libpod-conmon-5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1.scope.
Dec  1 04:14:48 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:48 np0005540741 podman[94938]: 2025-12-01 09:14:48.568689989 +0000 UTC m=+0.022252199 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:48 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d11d224c2f29d195088492fa8fb5ca6c475aad7c5bb1b2263a519bcaf46b3a98/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:48 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d11d224c2f29d195088492fa8fb5ca6c475aad7c5bb1b2263a519bcaf46b3a98/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:48 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d11d224c2f29d195088492fa8fb5ca6c475aad7c5bb1b2263a519bcaf46b3a98/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:48 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d11d224c2f29d195088492fa8fb5ca6c475aad7c5bb1b2263a519bcaf46b3a98/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:48 np0005540741 podman[94938]: 2025-12-01 09:14:48.675618199 +0000 UTC m=+0.129180399 container init 5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_hermann, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec  1 04:14:48 np0005540741 podman[94938]: 2025-12-01 09:14:48.688209052 +0000 UTC m=+0.141771252 container start 5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_hermann, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec  1 04:14:48 np0005540741 podman[94938]: 2025-12-01 09:14:48.692080051 +0000 UTC m=+0.145642241 container attach 5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_hermann, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:14:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v59: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:14:49 np0005540741 python3[94985]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   status --format json | jq .osdmap.num_up_osds _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:14:49 np0005540741 podman[94987]: 2025-12-01 09:14:49.267957265 +0000 UTC m=+0.060678571 container create 859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c (image=quay.io/ceph/ceph:v18, name=hardcore_elbakyan, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:14:49 np0005540741 systemd[1]: Started libpod-conmon-859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c.scope.
Dec  1 04:14:49 np0005540741 podman[94987]: 2025-12-01 09:14:49.238580419 +0000 UTC m=+0.031301785 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:14:49 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:49 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f06b18828f78f6cfa8695665f31ff023fed42c080f6fb35970bc8cb2afa30d2/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:49 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f06b18828f78f6cfa8695665f31ff023fed42c080f6fb35970bc8cb2afa30d2/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:49 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f06b18828f78f6cfa8695665f31ff023fed42c080f6fb35970bc8cb2afa30d2/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:49 np0005540741 podman[94987]: 2025-12-01 09:14:49.36163502 +0000 UTC m=+0.154356326 container init 859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c (image=quay.io/ceph/ceph:v18, name=hardcore_elbakyan, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec  1 04:14:49 np0005540741 podman[94987]: 2025-12-01 09:14:49.36850914 +0000 UTC m=+0.161230446 container start 859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c (image=quay.io/ceph/ceph:v18, name=hardcore_elbakyan, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:49 np0005540741 podman[94987]: 2025-12-01 09:14:49.373060438 +0000 UTC m=+0.165781764 container attach 859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c (image=quay.io/ceph/ceph:v18, name=hardcore_elbakyan, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True)
Dec  1 04:14:49 np0005540741 great_hermann[94955]: {
Dec  1 04:14:49 np0005540741 great_hermann[94955]:    "0": [
Dec  1 04:14:49 np0005540741 great_hermann[94955]:        {
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "devices": [
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "/dev/loop3"
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            ],
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "lv_name": "ceph_lv0",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "lv_size": "21470642176",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "name": "ceph_lv0",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "tags": {
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.cluster_name": "ceph",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.crush_device_class": "",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.encrypted": "0",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.osd_id": "0",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.type": "block",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.vdo": "0"
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            },
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "type": "block",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "vg_name": "ceph_vg0"
Dec  1 04:14:49 np0005540741 great_hermann[94955]:        }
Dec  1 04:14:49 np0005540741 great_hermann[94955]:    ],
Dec  1 04:14:49 np0005540741 great_hermann[94955]:    "1": [
Dec  1 04:14:49 np0005540741 great_hermann[94955]:        {
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "devices": [
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "/dev/loop4"
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            ],
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "lv_name": "ceph_lv1",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "lv_size": "21470642176",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "name": "ceph_lv1",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "tags": {
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.cluster_name": "ceph",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.crush_device_class": "",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.encrypted": "0",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.osd_id": "1",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.type": "block",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.vdo": "0"
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            },
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "type": "block",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "vg_name": "ceph_vg1"
Dec  1 04:14:49 np0005540741 great_hermann[94955]:        }
Dec  1 04:14:49 np0005540741 great_hermann[94955]:    ],
Dec  1 04:14:49 np0005540741 great_hermann[94955]:    "2": [
Dec  1 04:14:49 np0005540741 great_hermann[94955]:        {
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "devices": [
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "/dev/loop5"
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            ],
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "lv_name": "ceph_lv2",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "lv_size": "21470642176",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "name": "ceph_lv2",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "tags": {
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.cluster_name": "ceph",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.crush_device_class": "",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.encrypted": "0",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.osd_id": "2",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.type": "block",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:                "ceph.vdo": "0"
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            },
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "type": "block",
Dec  1 04:14:49 np0005540741 great_hermann[94955]:            "vg_name": "ceph_vg2"
Dec  1 04:14:49 np0005540741 great_hermann[94955]:        }
Dec  1 04:14:49 np0005540741 great_hermann[94955]:    ]
Dec  1 04:14:49 np0005540741 great_hermann[94955]: }
Dec  1 04:14:49 np0005540741 systemd[1]: libpod-5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1.scope: Deactivated successfully.
Dec  1 04:14:49 np0005540741 podman[94938]: 2025-12-01 09:14:49.538866692 +0000 UTC m=+0.992428902 container died 5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_hermann, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec  1 04:14:49 np0005540741 systemd[1]: var-lib-containers-storage-overlay-d11d224c2f29d195088492fa8fb5ca6c475aad7c5bb1b2263a519bcaf46b3a98-merged.mount: Deactivated successfully.
Dec  1 04:14:49 np0005540741 podman[94938]: 2025-12-01 09:14:49.591258349 +0000 UTC m=+1.044820539 container remove 5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_hermann, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:14:49 np0005540741 systemd[1]: libpod-conmon-5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1.scope: Deactivated successfully.
Dec  1 04:14:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json"} v 0) v1
Dec  1 04:14:49 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1076553885' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec  1 04:14:49 np0005540741 hardcore_elbakyan[95003]: 
Dec  1 04:14:49 np0005540741 hardcore_elbakyan[95003]: {"fsid":"5620a9fb-e540-5250-a0e8-7aaad5347e3b","health":{"status":"HEALTH_OK","checks":{},"mutes":[]},"election_epoch":5,"quorum":[0],"quorum_names":["compute-0"],"quorum_age":147,"monmap":{"epoch":1,"min_mon_release_name":"reef","num_mons":1},"osdmap":{"epoch":20,"num_osds":3,"num_up_osds":3,"osd_up_since":1764580474,"num_in_osds":3,"osd_in_since":1764580437,"num_remapped_pgs":0},"pgmap":{"pgs_by_state":[{"state_name":"active+clean","count":1}],"num_pgs":1,"num_pools":1,"num_objects":2,"data_bytes":459280,"bytes_used":502837248,"bytes_avail":63909089280,"bytes_total":64411926528},"fsmap":{"epoch":1,"by_rank":[],"up:standby":0},"mgrmap":{"available":true,"num_standbys":0,"modules":["cephadm","iostat","nfs","restful"],"services":{}},"servicemap":{"epoch":2,"modified":"2025-12-01T09:14:14.959091+0000","services":{}},"progress_events":{}}
Dec  1 04:14:50 np0005540741 systemd[1]: libpod-859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c.scope: Deactivated successfully.
Dec  1 04:14:50 np0005540741 podman[94987]: 2025-12-01 09:14:50.014262383 +0000 UTC m=+0.806983699 container died 859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c (image=quay.io/ceph/ceph:v18, name=hardcore_elbakyan, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:50 np0005540741 systemd[1]: var-lib-containers-storage-overlay-1f06b18828f78f6cfa8695665f31ff023fed42c080f6fb35970bc8cb2afa30d2-merged.mount: Deactivated successfully.
Dec  1 04:14:50 np0005540741 podman[94987]: 2025-12-01 09:14:50.072371674 +0000 UTC m=+0.865093010 container remove 859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c (image=quay.io/ceph/ceph:v18, name=hardcore_elbakyan, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:14:50 np0005540741 systemd[1]: libpod-conmon-859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c.scope: Deactivated successfully.
Dec  1 04:14:50 np0005540741 podman[95201]: 2025-12-01 09:14:50.197341713 +0000 UTC m=+0.037007169 container create 262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_khorana, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:50 np0005540741 systemd[1]: Started libpod-conmon-262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968.scope.
Dec  1 04:14:50 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:50 np0005540741 podman[95201]: 2025-12-01 09:14:50.267673177 +0000 UTC m=+0.107338653 container init 262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_khorana, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec  1 04:14:50 np0005540741 podman[95201]: 2025-12-01 09:14:50.274215247 +0000 UTC m=+0.113880713 container start 262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_khorana, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:50 np0005540741 podman[95201]: 2025-12-01 09:14:50.181516711 +0000 UTC m=+0.021182187 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:50 np0005540741 podman[95201]: 2025-12-01 09:14:50.277469326 +0000 UTC m=+0.117134802 container attach 262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_khorana, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  1 04:14:50 np0005540741 goofy_khorana[95217]: 167 167
Dec  1 04:14:50 np0005540741 systemd[1]: libpod-262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968.scope: Deactivated successfully.
Dec  1 04:14:50 np0005540741 podman[95201]: 2025-12-01 09:14:50.278771165 +0000 UTC m=+0.118436621 container died 262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_khorana, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:50 np0005540741 systemd[1]: var-lib-containers-storage-overlay-063b5f59300888895678bf4cc9ecb77d88e38787a81fdfd3cc791f080adbb036-merged.mount: Deactivated successfully.
Dec  1 04:14:50 np0005540741 podman[95201]: 2025-12-01 09:14:50.312518904 +0000 UTC m=+0.152184360 container remove 262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_khorana, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Dec  1 04:14:50 np0005540741 systemd[1]: libpod-conmon-262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968.scope: Deactivated successfully.
Dec  1 04:14:50 np0005540741 podman[95265]: 2025-12-01 09:14:50.47148664 +0000 UTC m=+0.043214318 container create 87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_lamport, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec  1 04:14:50 np0005540741 systemd[1]: Started libpod-conmon-87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5.scope.
Dec  1 04:14:50 np0005540741 python3[95259]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create vms  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:14:50 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:50 np0005540741 podman[95265]: 2025-12-01 09:14:50.453264184 +0000 UTC m=+0.024991882 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:14:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/493438c4c795cc66904661c5ff6827a8fa365df31329a35dc3777a38425c2dd4/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/493438c4c795cc66904661c5ff6827a8fa365df31329a35dc3777a38425c2dd4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/493438c4c795cc66904661c5ff6827a8fa365df31329a35dc3777a38425c2dd4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/493438c4c795cc66904661c5ff6827a8fa365df31329a35dc3777a38425c2dd4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:50 np0005540741 podman[95265]: 2025-12-01 09:14:50.567397883 +0000 UTC m=+0.139125581 container init 87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_lamport, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:14:50 np0005540741 podman[95265]: 2025-12-01 09:14:50.57450736 +0000 UTC m=+0.146235038 container start 87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_lamport, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:50 np0005540741 podman[95265]: 2025-12-01 09:14:50.579871084 +0000 UTC m=+0.151598782 container attach 87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_lamport, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:50 np0005540741 podman[95284]: 2025-12-01 09:14:50.588010752 +0000 UTC m=+0.043039133 container create fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68 (image=quay.io/ceph/ceph:v18, name=sleepy_gates, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:14:50 np0005540741 systemd[1]: Started libpod-conmon-fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68.scope.
Dec  1 04:14:50 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:50 np0005540741 podman[95284]: 2025-12-01 09:14:50.569199508 +0000 UTC m=+0.024227899 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:14:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e066a1ec59fc60d27d3ea1360a65001911a92d455ed84fba42bebfd09656328/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e066a1ec59fc60d27d3ea1360a65001911a92d455ed84fba42bebfd09656328/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:50 np0005540741 podman[95284]: 2025-12-01 09:14:50.67718814 +0000 UTC m=+0.132216541 container init fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68 (image=quay.io/ceph/ceph:v18, name=sleepy_gates, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:14:50 np0005540741 podman[95284]: 2025-12-01 09:14:50.685457962 +0000 UTC m=+0.140486343 container start fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68 (image=quay.io/ceph/ceph:v18, name=sleepy_gates, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3)
Dec  1 04:14:50 np0005540741 podman[95284]: 2025-12-01 09:14:50.689209847 +0000 UTC m=+0.144238258 container attach fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68 (image=quay.io/ceph/ceph:v18, name=sleepy_gates, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec  1 04:14:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v60: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:14:51 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0) v1
Dec  1 04:14:51 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/4139565444' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]: {
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:        "osd_id": 0,
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:        "type": "bluestore"
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:    },
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:        "osd_id": 1,
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:        "type": "bluestore"
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:    },
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:        "osd_id": 2,
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:        "type": "bluestore"
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]:    }
Dec  1 04:14:51 np0005540741 amazing_lamport[95282]: }
Dec  1 04:14:51 np0005540741 systemd[1]: libpod-87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5.scope: Deactivated successfully.
Dec  1 04:14:51 np0005540741 systemd[1]: libpod-87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5.scope: Consumed 1.307s CPU time.
Dec  1 04:14:51 np0005540741 podman[95265]: 2025-12-01 09:14:51.890456823 +0000 UTC m=+1.462184511 container died 87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_lamport, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:51 np0005540741 systemd[1]: var-lib-containers-storage-overlay-493438c4c795cc66904661c5ff6827a8fa365df31329a35dc3777a38425c2dd4-merged.mount: Deactivated successfully.
Dec  1 04:14:51 np0005540741 podman[95265]: 2025-12-01 09:14:51.93891241 +0000 UTC m=+1.510640088 container remove 87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_lamport, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec  1 04:14:51 np0005540741 systemd[1]: libpod-conmon-87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5.scope: Deactivated successfully.
Dec  1 04:14:51 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:14:51 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:51 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:14:51 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:52 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e20 do_prune osdmap full prune enabled
Dec  1 04:14:52 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/4139565444' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec  1 04:14:52 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:52 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:14:52 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/4139565444' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec  1 04:14:52 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e21 e21: 3 total, 3 up, 3 in
Dec  1 04:14:52 np0005540741 sleepy_gates[95302]: pool 'vms' created
Dec  1 04:14:52 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e21: 3 total, 3 up, 3 in
Dec  1 04:14:52 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 21 pg[2.0( empty local-lis/les=0/0 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [2] r=0 lpr=21 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:14:52 np0005540741 systemd[1]: libpod-fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68.scope: Deactivated successfully.
Dec  1 04:14:52 np0005540741 podman[95284]: 2025-12-01 09:14:52.195616005 +0000 UTC m=+1.650644396 container died fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68 (image=quay.io/ceph/ceph:v18, name=sleepy_gates, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec  1 04:14:52 np0005540741 systemd[1]: var-lib-containers-storage-overlay-0e066a1ec59fc60d27d3ea1360a65001911a92d455ed84fba42bebfd09656328-merged.mount: Deactivated successfully.
Dec  1 04:14:52 np0005540741 podman[95284]: 2025-12-01 09:14:52.243491784 +0000 UTC m=+1.698520165 container remove fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68 (image=quay.io/ceph/ceph:v18, name=sleepy_gates, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec  1 04:14:52 np0005540741 systemd[1]: libpod-conmon-fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68.scope: Deactivated successfully.
Dec  1 04:14:52 np0005540741 python3[95460]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create volumes  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:14:52 np0005540741 podman[95461]: 2025-12-01 09:14:52.626347855 +0000 UTC m=+0.044887300 container create f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb (image=quay.io/ceph/ceph:v18, name=epic_dewdney, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Dec  1 04:14:52 np0005540741 systemd[1]: Started libpod-conmon-f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb.scope.
Dec  1 04:14:52 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:52 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/153f7faa2662baaae11662c450fe794fd8815f849dd49e3c9f1931ee97d069d7/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:52 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/153f7faa2662baaae11662c450fe794fd8815f849dd49e3c9f1931ee97d069d7/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:52 np0005540741 podman[95461]: 2025-12-01 09:14:52.697143753 +0000 UTC m=+0.115683208 container init f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb (image=quay.io/ceph/ceph:v18, name=epic_dewdney, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True)
Dec  1 04:14:52 np0005540741 podman[95461]: 2025-12-01 09:14:52.607910063 +0000 UTC m=+0.026449518 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:14:52 np0005540741 podman[95461]: 2025-12-01 09:14:52.707399405 +0000 UTC m=+0.125938840 container start f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb (image=quay.io/ceph/ceph:v18, name=epic_dewdney, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  1 04:14:52 np0005540741 podman[95461]: 2025-12-01 09:14:52.710798569 +0000 UTC m=+0.129338014 container attach f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb (image=quay.io/ceph/ceph:v18, name=epic_dewdney, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Dec  1 04:14:52 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e21 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:14:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v62: 2 pgs: 1 active+clean, 1 unknown; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:14:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e21 do_prune osdmap full prune enabled
Dec  1 04:14:53 np0005540741 ceph-mon[75031]: log_channel(cluster) log [WRN] : Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec  1 04:14:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e22 e22: 3 total, 3 up, 3 in
Dec  1 04:14:53 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e22: 3 total, 3 up, 3 in
Dec  1 04:14:53 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/4139565444' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec  1 04:14:53 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 22 pg[2.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [2] r=0 lpr=21 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:14:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0) v1
Dec  1 04:14:53 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/60547159' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec  1 04:14:54 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e22 do_prune osdmap full prune enabled
Dec  1 04:14:54 np0005540741 ceph-mon[75031]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec  1 04:14:54 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/60547159' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec  1 04:14:54 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/60547159' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec  1 04:14:54 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e23 e23: 3 total, 3 up, 3 in
Dec  1 04:14:54 np0005540741 epic_dewdney[95477]: pool 'volumes' created
Dec  1 04:14:54 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e23: 3 total, 3 up, 3 in
Dec  1 04:14:54 np0005540741 systemd[1]: libpod-f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb.scope: Deactivated successfully.
Dec  1 04:14:54 np0005540741 podman[95461]: 2025-12-01 09:14:54.230995287 +0000 UTC m=+1.649534762 container died f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb (image=quay.io/ceph/ceph:v18, name=epic_dewdney, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:54 np0005540741 systemd[1]: var-lib-containers-storage-overlay-153f7faa2662baaae11662c450fe794fd8815f849dd49e3c9f1931ee97d069d7-merged.mount: Deactivated successfully.
Dec  1 04:14:54 np0005540741 podman[95461]: 2025-12-01 09:14:54.281128035 +0000 UTC m=+1.699667470 container remove f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb (image=quay.io/ceph/ceph:v18, name=epic_dewdney, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:14:54 np0005540741 systemd[1]: libpod-conmon-f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb.scope: Deactivated successfully.
Dec  1 04:14:54 np0005540741 python3[95538]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create backups  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:14:54 np0005540741 podman[95539]: 2025-12-01 09:14:54.632992941 +0000 UTC m=+0.042183937 container create c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a (image=quay.io/ceph/ceph:v18, name=vigorous_noether, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec  1 04:14:54 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 23 pg[3.0( empty local-lis/les=0/0 n=0 ec=23/23 lis/c=0/0 les/c/f=0/0/0 sis=23) [1] r=0 lpr=23 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:14:54 np0005540741 systemd[1]: Started libpod-conmon-c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a.scope.
Dec  1 04:14:54 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:54 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa5d8d0787aa7ef8c40321ce0843b610a3337d9a2a44e6a532cf967ef0bc8355/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:54 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa5d8d0787aa7ef8c40321ce0843b610a3337d9a2a44e6a532cf967ef0bc8355/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:54 np0005540741 podman[95539]: 2025-12-01 09:14:54.698883499 +0000 UTC m=+0.108074515 container init c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a (image=quay.io/ceph/ceph:v18, name=vigorous_noether, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:54 np0005540741 podman[95539]: 2025-12-01 09:14:54.705354656 +0000 UTC m=+0.114545652 container start c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a (image=quay.io/ceph/ceph:v18, name=vigorous_noether, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:54 np0005540741 podman[95539]: 2025-12-01 09:14:54.708609836 +0000 UTC m=+0.117800852 container attach c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a (image=quay.io/ceph/ceph:v18, name=vigorous_noether, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:14:54 np0005540741 podman[95539]: 2025-12-01 09:14:54.613908049 +0000 UTC m=+0.023099075 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:14:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v65: 3 pgs: 1 active+clean, 2 unknown; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:14:55 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e23 do_prune osdmap full prune enabled
Dec  1 04:14:55 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/60547159' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec  1 04:14:55 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e24 e24: 3 total, 3 up, 3 in
Dec  1 04:14:55 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e24: 3 total, 3 up, 3 in
Dec  1 04:14:55 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 24 pg[3.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=0/0 les/c/f=0/0/0 sis=23) [1] r=0 lpr=23 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:14:55 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0) v1
Dec  1 04:14:55 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3505839965' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec  1 04:14:56 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e24 do_prune osdmap full prune enabled
Dec  1 04:14:56 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3505839965' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec  1 04:14:56 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e25 e25: 3 total, 3 up, 3 in
Dec  1 04:14:56 np0005540741 vigorous_noether[95555]: pool 'backups' created
Dec  1 04:14:56 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e25: 3 total, 3 up, 3 in
Dec  1 04:14:56 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/3505839965' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec  1 04:14:56 np0005540741 systemd[1]: libpod-c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a.scope: Deactivated successfully.
Dec  1 04:14:56 np0005540741 podman[95539]: 2025-12-01 09:14:56.240741679 +0000 UTC m=+1.649932675 container died c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a (image=quay.io/ceph/ceph:v18, name=vigorous_noether, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Dec  1 04:14:56 np0005540741 systemd[1]: var-lib-containers-storage-overlay-aa5d8d0787aa7ef8c40321ce0843b610a3337d9a2a44e6a532cf967ef0bc8355-merged.mount: Deactivated successfully.
Dec  1 04:14:56 np0005540741 podman[95539]: 2025-12-01 09:14:56.285696469 +0000 UTC m=+1.694887465 container remove c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a (image=quay.io/ceph/ceph:v18, name=vigorous_noether, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec  1 04:14:56 np0005540741 systemd[1]: libpod-conmon-c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a.scope: Deactivated successfully.
Dec  1 04:14:56 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 25 pg[4.0( empty local-lis/les=0/0 n=0 ec=25/25 lis/c=0/0 les/c/f=0/0/0 sis=25) [0] r=0 lpr=25 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:14:56 np0005540741 python3[95619]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create images  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:14:56 np0005540741 podman[95620]: 2025-12-01 09:14:56.631891542 +0000 UTC m=+0.050305215 container create 474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018 (image=quay.io/ceph/ceph:v18, name=jovial_faraday, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:56 np0005540741 systemd[1]: Started libpod-conmon-474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018.scope.
Dec  1 04:14:56 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:56 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1416c27369da903e47d370554a4469bd50572f8bdcac8fc960db13d06d605bdb/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:56 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1416c27369da903e47d370554a4469bd50572f8bdcac8fc960db13d06d605bdb/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:56 np0005540741 podman[95620]: 2025-12-01 09:14:56.604622271 +0000 UTC m=+0.023036034 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:14:56 np0005540741 podman[95620]: 2025-12-01 09:14:56.708179526 +0000 UTC m=+0.126593209 container init 474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018 (image=quay.io/ceph/ceph:v18, name=jovial_faraday, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:14:56 np0005540741 podman[95620]: 2025-12-01 09:14:56.71388829 +0000 UTC m=+0.132301963 container start 474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018 (image=quay.io/ceph/ceph:v18, name=jovial_faraday, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2)
Dec  1 04:14:56 np0005540741 podman[95620]: 2025-12-01 09:14:56.717188601 +0000 UTC m=+0.135602484 container attach 474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018 (image=quay.io/ceph/ceph:v18, name=jovial_faraday, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v68: 4 pgs: 2 active+clean, 2 unknown; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:14:57 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e25 do_prune osdmap full prune enabled
Dec  1 04:14:57 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e26 e26: 3 total, 3 up, 3 in
Dec  1 04:14:57 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/3505839965' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec  1 04:14:57 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e26: 3 total, 3 up, 3 in
Dec  1 04:14:57 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 26 pg[4.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=0/0 les/c/f=0/0/0 sis=25) [0] r=0 lpr=25 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:14:57 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0) v1
Dec  1 04:14:57 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/616124501' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec  1 04:14:57 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e26 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:14:58 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e26 do_prune osdmap full prune enabled
Dec  1 04:14:58 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/616124501' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec  1 04:14:58 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/616124501' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec  1 04:14:58 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e27 e27: 3 total, 3 up, 3 in
Dec  1 04:14:58 np0005540741 jovial_faraday[95635]: pool 'images' created
Dec  1 04:14:58 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e27: 3 total, 3 up, 3 in
Dec  1 04:14:58 np0005540741 systemd[1]: libpod-474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018.scope: Deactivated successfully.
Dec  1 04:14:58 np0005540741 podman[95620]: 2025-12-01 09:14:58.279782442 +0000 UTC m=+1.698196115 container died 474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018 (image=quay.io/ceph/ceph:v18, name=jovial_faraday, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec  1 04:14:58 np0005540741 systemd[1]: var-lib-containers-storage-overlay-1416c27369da903e47d370554a4469bd50572f8bdcac8fc960db13d06d605bdb-merged.mount: Deactivated successfully.
Dec  1 04:14:58 np0005540741 podman[95620]: 2025-12-01 09:14:58.326393093 +0000 UTC m=+1.744806786 container remove 474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018 (image=quay.io/ceph/ceph:v18, name=jovial_faraday, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:58 np0005540741 systemd[1]: libpod-conmon-474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018.scope: Deactivated successfully.
Dec  1 04:14:58 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 27 pg[5.0( empty local-lis/les=0/0 n=0 ec=27/27 lis/c=0/0 les/c/f=0/0/0 sis=27) [2] r=0 lpr=27 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:14:58 np0005540741 python3[95700]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create cephfs.cephfs.meta  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:14:58 np0005540741 podman[95701]: 2025-12-01 09:14:58.677193646 +0000 UTC m=+0.051344686 container create b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8 (image=quay.io/ceph/ceph:v18, name=ecstatic_jepsen, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 04:14:58 np0005540741 systemd[1]: Started libpod-conmon-b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8.scope.
Dec  1 04:14:58 np0005540741 podman[95701]: 2025-12-01 09:14:58.656797844 +0000 UTC m=+0.030948904 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:14:58 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:14:58 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85fc54dc345fd5ed408303be7b4539741a67940f7872e9f53e03cc1153b2ea16/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:58 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85fc54dc345fd5ed408303be7b4539741a67940f7872e9f53e03cc1153b2ea16/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:14:58 np0005540741 podman[95701]: 2025-12-01 09:14:58.773217803 +0000 UTC m=+0.147368873 container init b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8 (image=quay.io/ceph/ceph:v18, name=ecstatic_jepsen, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:58 np0005540741 podman[95701]: 2025-12-01 09:14:58.778495594 +0000 UTC m=+0.152646634 container start b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8 (image=quay.io/ceph/ceph:v18, name=ecstatic_jepsen, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:14:58 np0005540741 podman[95701]: 2025-12-01 09:14:58.782396283 +0000 UTC m=+0.156547353 container attach b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8 (image=quay.io/ceph/ceph:v18, name=ecstatic_jepsen, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:14:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v71: 5 pgs: 3 active+clean, 2 unknown; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:14:59 np0005540741 ceph-mon[75031]: log_channel(cluster) log [WRN] : Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec  1 04:14:59 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e27 do_prune osdmap full prune enabled
Dec  1 04:14:59 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e28 e28: 3 total, 3 up, 3 in
Dec  1 04:14:59 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/616124501' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec  1 04:14:59 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e28: 3 total, 3 up, 3 in
Dec  1 04:14:59 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 28 pg[5.0( empty local-lis/les=27/28 n=0 ec=27/27 lis/c=0/0 les/c/f=0/0/0 sis=27) [2] r=0 lpr=27 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:14:59 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0) v1
Dec  1 04:14:59 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2536440165' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec  1 04:15:00 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e28 do_prune osdmap full prune enabled
Dec  1 04:15:00 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2536440165' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec  1 04:15:00 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e29 e29: 3 total, 3 up, 3 in
Dec  1 04:15:00 np0005540741 ecstatic_jepsen[95716]: pool 'cephfs.cephfs.meta' created
Dec  1 04:15:00 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e29: 3 total, 3 up, 3 in
Dec  1 04:15:00 np0005540741 ceph-mon[75031]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec  1 04:15:00 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/2536440165' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec  1 04:15:00 np0005540741 systemd[1]: libpod-b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8.scope: Deactivated successfully.
Dec  1 04:15:00 np0005540741 podman[95701]: 2025-12-01 09:15:00.30059221 +0000 UTC m=+1.674743250 container died b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8 (image=quay.io/ceph/ceph:v18, name=ecstatic_jepsen, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  1 04:15:00 np0005540741 systemd[1]: var-lib-containers-storage-overlay-85fc54dc345fd5ed408303be7b4539741a67940f7872e9f53e03cc1153b2ea16-merged.mount: Deactivated successfully.
Dec  1 04:15:00 np0005540741 podman[95701]: 2025-12-01 09:15:00.38293246 +0000 UTC m=+1.757083500 container remove b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8 (image=quay.io/ceph/ceph:v18, name=ecstatic_jepsen, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:15:00 np0005540741 systemd[1]: libpod-conmon-b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8.scope: Deactivated successfully.
Dec  1 04:15:00 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 29 pg[6.0( empty local-lis/les=0/0 n=0 ec=29/29 lis/c=0/0 les/c/f=0/0/0 sis=29) [0] r=0 lpr=29 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:00 np0005540741 python3[95782]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create cephfs.cephfs.data  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:00 np0005540741 podman[95783]: 2025-12-01 09:15:00.767572454 +0000 UTC m=+0.021754904 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:00 np0005540741 podman[95783]: 2025-12-01 09:15:00.876594838 +0000 UTC m=+0.130777308 container create e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740 (image=quay.io/ceph/ceph:v18, name=elated_sanderson, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:15:00 np0005540741 systemd[1]: Started libpod-conmon-e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740.scope.
Dec  1 04:15:00 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:00 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0a8ec137484ab4f3843ac06a7f950734647c9e87c6b3a3b1477e6e8ca744698/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:00 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0a8ec137484ab4f3843ac06a7f950734647c9e87c6b3a3b1477e6e8ca744698/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:00 np0005540741 podman[95783]: 2025-12-01 09:15:00.956505323 +0000 UTC m=+0.210687773 container init e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740 (image=quay.io/ceph/ceph:v18, name=elated_sanderson, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec  1 04:15:00 np0005540741 podman[95783]: 2025-12-01 09:15:00.962405513 +0000 UTC m=+0.216587943 container start e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740 (image=quay.io/ceph/ceph:v18, name=elated_sanderson, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  1 04:15:00 np0005540741 podman[95783]: 2025-12-01 09:15:00.965924971 +0000 UTC m=+0.220107421 container attach e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740 (image=quay.io/ceph/ceph:v18, name=elated_sanderson, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec  1 04:15:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v74: 6 pgs: 1 unknown, 5 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:01 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e29 do_prune osdmap full prune enabled
Dec  1 04:15:01 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/2536440165' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec  1 04:15:01 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e30 e30: 3 total, 3 up, 3 in
Dec  1 04:15:01 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e30: 3 total, 3 up, 3 in
Dec  1 04:15:01 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 30 pg[6.0( empty local-lis/les=29/30 n=0 ec=29/29 lis/c=0/0 les/c/f=0/0/0 sis=29) [0] r=0 lpr=29 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:01 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0) v1
Dec  1 04:15:01 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1765389376' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec  1 04:15:02 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e30 do_prune osdmap full prune enabled
Dec  1 04:15:02 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1765389376' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec  1 04:15:02 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e31 e31: 3 total, 3 up, 3 in
Dec  1 04:15:02 np0005540741 elated_sanderson[95799]: pool 'cephfs.cephfs.data' created
Dec  1 04:15:02 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e31: 3 total, 3 up, 3 in
Dec  1 04:15:02 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/1765389376' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec  1 04:15:02 np0005540741 systemd[1]: libpod-e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740.scope: Deactivated successfully.
Dec  1 04:15:02 np0005540741 podman[95783]: 2025-12-01 09:15:02.33555693 +0000 UTC m=+1.589739380 container died e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740 (image=quay.io/ceph/ceph:v18, name=elated_sanderson, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:02 np0005540741 systemd[1]: var-lib-containers-storage-overlay-b0a8ec137484ab4f3843ac06a7f950734647c9e87c6b3a3b1477e6e8ca744698-merged.mount: Deactivated successfully.
Dec  1 04:15:02 np0005540741 podman[95783]: 2025-12-01 09:15:02.376456056 +0000 UTC m=+1.630638486 container remove e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740 (image=quay.io/ceph/ceph:v18, name=elated_sanderson, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:15:02 np0005540741 systemd[1]: libpod-conmon-e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740.scope: Deactivated successfully.
Dec  1 04:15:02 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 31 pg[7.0( empty local-lis/les=0/0 n=0 ec=31/31 lis/c=0/0 les/c/f=0/0/0 sis=31) [1] r=0 lpr=31 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:02 np0005540741 python3[95864]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable vms rbd _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:02 np0005540741 podman[95865]: 2025-12-01 09:15:02.772428786 +0000 UTC m=+0.044841257 container create de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7 (image=quay.io/ceph/ceph:v18, name=festive_mestorf, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True)
Dec  1 04:15:02 np0005540741 systemd[1]: Started libpod-conmon-de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7.scope.
Dec  1 04:15:02 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:02 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5487e1af34f86b0f3021e2871e9de50af06f5dbff044f7c05cda784cc1ae6630/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:02 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5487e1af34f86b0f3021e2871e9de50af06f5dbff044f7c05cda784cc1ae6630/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:02 np0005540741 podman[95865]: 2025-12-01 09:15:02.752590082 +0000 UTC m=+0.025002573 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:02 np0005540741 podman[95865]: 2025-12-01 09:15:02.853856239 +0000 UTC m=+0.126268740 container init de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7 (image=quay.io/ceph/ceph:v18, name=festive_mestorf, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True)
Dec  1 04:15:02 np0005540741 podman[95865]: 2025-12-01 09:15:02.860885843 +0000 UTC m=+0.133298334 container start de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7 (image=quay.io/ceph/ceph:v18, name=festive_mestorf, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:15:02 np0005540741 podman[95865]: 2025-12-01 09:15:02.864909525 +0000 UTC m=+0.137322057 container attach de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7 (image=quay.io/ceph/ceph:v18, name=festive_mestorf, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec  1 04:15:02 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:15:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v77: 7 pgs: 2 unknown, 5 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e31 do_prune osdmap full prune enabled
Dec  1 04:15:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e32 e32: 3 total, 3 up, 3 in
Dec  1 04:15:03 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/1765389376' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec  1 04:15:03 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e32: 3 total, 3 up, 3 in
Dec  1 04:15:03 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 32 pg[7.0( empty local-lis/les=31/32 n=0 ec=31/31 lis/c=0/0 les/c/f=0/0/0 sis=31) [1] r=0 lpr=31 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"} v 0) v1
Dec  1 04:15:03 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1407074296' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Dec  1 04:15:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e32 do_prune osdmap full prune enabled
Dec  1 04:15:04 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1407074296' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Dec  1 04:15:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e33 e33: 3 total, 3 up, 3 in
Dec  1 04:15:04 np0005540741 festive_mestorf[95880]: enabled application 'rbd' on pool 'vms'
Dec  1 04:15:04 np0005540741 systemd[1]: libpod-de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7.scope: Deactivated successfully.
Dec  1 04:15:04 np0005540741 podman[95865]: 2025-12-01 09:15:04.558888922 +0000 UTC m=+1.831301403 container died de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7 (image=quay.io/ceph/ceph:v18, name=festive_mestorf, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:15:04 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e33: 3 total, 3 up, 3 in
Dec  1 04:15:04 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/1407074296' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Dec  1 04:15:04 np0005540741 systemd[1]: var-lib-containers-storage-overlay-5487e1af34f86b0f3021e2871e9de50af06f5dbff044f7c05cda784cc1ae6630-merged.mount: Deactivated successfully.
Dec  1 04:15:04 np0005540741 podman[95865]: 2025-12-01 09:15:04.956693005 +0000 UTC m=+2.229105486 container remove de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7 (image=quay.io/ceph/ceph:v18, name=festive_mestorf, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True)
Dec  1 04:15:05 np0005540741 systemd[1]: libpod-conmon-de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7.scope: Deactivated successfully.
Dec  1 04:15:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v80: 7 pgs: 2 unknown, 5 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:05 np0005540741 python3[95943]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable volumes rbd _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:05 np0005540741 podman[95944]: 2025-12-01 09:15:05.298333786 +0000 UTC m=+0.041530089 container create eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289 (image=quay.io/ceph/ceph:v18, name=keen_lichterman, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:15:05 np0005540741 systemd[1]: Started libpod-conmon-eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289.scope.
Dec  1 04:15:05 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:05 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b06f4489a330341eb79e1af60872a39166144a92190e09b524d2833a5a074c23/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:05 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b06f4489a330341eb79e1af60872a39166144a92190e09b524d2833a5a074c23/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:05 np0005540741 podman[95944]: 2025-12-01 09:15:05.28143105 +0000 UTC m=+0.024627363 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:05 np0005540741 podman[95944]: 2025-12-01 09:15:05.391987348 +0000 UTC m=+0.135183671 container init eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289 (image=quay.io/ceph/ceph:v18, name=keen_lichterman, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True)
Dec  1 04:15:05 np0005540741 podman[95944]: 2025-12-01 09:15:05.397208534 +0000 UTC m=+0.140404837 container start eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289 (image=quay.io/ceph/ceph:v18, name=keen_lichterman, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec  1 04:15:05 np0005540741 podman[95944]: 2025-12-01 09:15:05.400514899 +0000 UTC m=+0.143711222 container attach eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289 (image=quay.io/ceph/ceph:v18, name=keen_lichterman, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507)
Dec  1 04:15:05 np0005540741 ceph-mon[75031]: log_channel(cluster) log [WRN] : Health check update: 5 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec  1 04:15:05 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/1407074296' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Dec  1 04:15:06 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"} v 0) v1
Dec  1 04:15:06 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/4284969463' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Dec  1 04:15:06 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e33 do_prune osdmap full prune enabled
Dec  1 04:15:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v81: 7 pgs: 1 unknown, 6 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:07 np0005540741 ceph-mon[75031]: Health check update: 5 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec  1 04:15:07 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/4284969463' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Dec  1 04:15:07 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/4284969463' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Dec  1 04:15:07 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e34 e34: 3 total, 3 up, 3 in
Dec  1 04:15:07 np0005540741 keen_lichterman[95960]: enabled application 'rbd' on pool 'volumes'
Dec  1 04:15:07 np0005540741 systemd[1]: libpod-eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289.scope: Deactivated successfully.
Dec  1 04:15:07 np0005540741 conmon[95960]: conmon eb314f53a0fff35d5fe7 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289.scope/container/memory.events
Dec  1 04:15:07 np0005540741 podman[95944]: 2025-12-01 09:15:07.214478289 +0000 UTC m=+1.957674592 container died eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289 (image=quay.io/ceph/ceph:v18, name=keen_lichterman, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec  1 04:15:07 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e34: 3 total, 3 up, 3 in
Dec  1 04:15:07 np0005540741 systemd[1]: var-lib-containers-storage-overlay-b06f4489a330341eb79e1af60872a39166144a92190e09b524d2833a5a074c23-merged.mount: Deactivated successfully.
Dec  1 04:15:07 np0005540741 podman[95944]: 2025-12-01 09:15:07.481497091 +0000 UTC m=+2.224693394 container remove eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289 (image=quay.io/ceph/ceph:v18, name=keen_lichterman, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  1 04:15:07 np0005540741 systemd[1]: libpod-conmon-eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289.scope: Deactivated successfully.
Dec  1 04:15:07 np0005540741 python3[96022]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable backups rbd _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:07 np0005540741 podman[96023]: 2025-12-01 09:15:07.851918795 +0000 UTC m=+0.048311344 container create 31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9 (image=quay.io/ceph/ceph:v18, name=heuristic_lehmann, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Dec  1 04:15:07 np0005540741 systemd[1]: Started libpod-conmon-31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9.scope.
Dec  1 04:15:07 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:07 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8528b24c36d5b81d162223350a8f63a8b0ec6e2d360df4ce267a3c19c93a49d6/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:07 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8528b24c36d5b81d162223350a8f63a8b0ec6e2d360df4ce267a3c19c93a49d6/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:07 np0005540741 podman[96023]: 2025-12-01 09:15:07.924576331 +0000 UTC m=+0.120968870 container init 31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9 (image=quay.io/ceph/ceph:v18, name=heuristic_lehmann, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Dec  1 04:15:07 np0005540741 podman[96023]: 2025-12-01 09:15:07.831449205 +0000 UTC m=+0.027841744 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:07 np0005540741 podman[96023]: 2025-12-01 09:15:07.931668276 +0000 UTC m=+0.128060795 container start 31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9 (image=quay.io/ceph/ceph:v18, name=heuristic_lehmann, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec  1 04:15:07 np0005540741 podman[96023]: 2025-12-01 09:15:07.942241131 +0000 UTC m=+0.138633650 container attach 31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9 (image=quay.io/ceph/ceph:v18, name=heuristic_lehmann, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:15:07 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e34 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:15:08 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/4284969463' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Dec  1 04:15:08 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"} v 0) v1
Dec  1 04:15:08 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2566742220' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Dec  1 04:15:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v83: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e34 do_prune osdmap full prune enabled
Dec  1 04:15:09 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/2566742220' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Dec  1 04:15:09 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2566742220' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Dec  1 04:15:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e35 e35: 3 total, 3 up, 3 in
Dec  1 04:15:09 np0005540741 heuristic_lehmann[96038]: enabled application 'rbd' on pool 'backups'
Dec  1 04:15:09 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e35: 3 total, 3 up, 3 in
Dec  1 04:15:09 np0005540741 systemd[1]: libpod-31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9.scope: Deactivated successfully.
Dec  1 04:15:09 np0005540741 podman[96063]: 2025-12-01 09:15:09.262542566 +0000 UTC m=+0.023409594 container died 31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9 (image=quay.io/ceph/ceph:v18, name=heuristic_lehmann, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec  1 04:15:09 np0005540741 systemd[1]: var-lib-containers-storage-overlay-8528b24c36d5b81d162223350a8f63a8b0ec6e2d360df4ce267a3c19c93a49d6-merged.mount: Deactivated successfully.
Dec  1 04:15:09 np0005540741 podman[96063]: 2025-12-01 09:15:09.302352699 +0000 UTC m=+0.063219717 container remove 31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9 (image=quay.io/ceph/ceph:v18, name=heuristic_lehmann, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:15:09 np0005540741 systemd[1]: libpod-conmon-31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9.scope: Deactivated successfully.
Dec  1 04:15:09 np0005540741 python3[96103]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable images rbd _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:09 np0005540741 podman[96104]: 2025-12-01 09:15:09.656078024 +0000 UTC m=+0.045444573 container create b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d (image=quay.io/ceph/ceph:v18, name=eager_engelbart, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec  1 04:15:09 np0005540741 systemd[1]: Started libpod-conmon-b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d.scope.
Dec  1 04:15:09 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:09 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8e2a33a845c37db36918105ca154bf9039932d469b00fbbf11c524feafcfd0a6/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:09 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8e2a33a845c37db36918105ca154bf9039932d469b00fbbf11c524feafcfd0a6/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:09 np0005540741 podman[96104]: 2025-12-01 09:15:09.632255678 +0000 UTC m=+0.021622207 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:09 np0005540741 podman[96104]: 2025-12-01 09:15:09.73662464 +0000 UTC m=+0.125991169 container init b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d (image=quay.io/ceph/ceph:v18, name=eager_engelbart, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:15:09 np0005540741 podman[96104]: 2025-12-01 09:15:09.741658649 +0000 UTC m=+0.131025148 container start b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d (image=quay.io/ceph/ceph:v18, name=eager_engelbart, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec  1 04:15:09 np0005540741 podman[96104]: 2025-12-01 09:15:09.745380637 +0000 UTC m=+0.134747146 container attach b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d (image=quay.io/ceph/ceph:v18, name=eager_engelbart, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:15:10 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/2566742220' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Dec  1 04:15:10 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "images", "app": "rbd"} v 0) v1
Dec  1 04:15:10 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3179720452' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Dec  1 04:15:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v85: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e35 do_prune osdmap full prune enabled
Dec  1 04:15:11 np0005540741 ceph-mon[75031]: log_channel(cluster) log [WRN] : Health check update: 3 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec  1 04:15:11 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/3179720452' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Dec  1 04:15:11 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3179720452' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Dec  1 04:15:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e36 e36: 3 total, 3 up, 3 in
Dec  1 04:15:11 np0005540741 eager_engelbart[96119]: enabled application 'rbd' on pool 'images'
Dec  1 04:15:11 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e36: 3 total, 3 up, 3 in
Dec  1 04:15:11 np0005540741 systemd[1]: libpod-b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d.scope: Deactivated successfully.
Dec  1 04:15:11 np0005540741 conmon[96119]: conmon b6ba5c3d9be627014bef <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d.scope/container/memory.events
Dec  1 04:15:11 np0005540741 podman[96104]: 2025-12-01 09:15:11.256255839 +0000 UTC m=+1.645622348 container died b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d (image=quay.io/ceph/ceph:v18, name=eager_engelbart, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec  1 04:15:11 np0005540741 systemd[1]: var-lib-containers-storage-overlay-8e2a33a845c37db36918105ca154bf9039932d469b00fbbf11c524feafcfd0a6-merged.mount: Deactivated successfully.
Dec  1 04:15:11 np0005540741 podman[96104]: 2025-12-01 09:15:11.302917389 +0000 UTC m=+1.692283888 container remove b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d (image=quay.io/ceph/ceph:v18, name=eager_engelbart, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:15:11 np0005540741 systemd[1]: libpod-conmon-b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d.scope: Deactivated successfully.
Dec  1 04:15:11 np0005540741 python3[96183]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable cephfs.cephfs.meta cephfs _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:11 np0005540741 podman[96184]: 2025-12-01 09:15:11.619812325 +0000 UTC m=+0.041562970 container create d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1 (image=quay.io/ceph/ceph:v18, name=clever_snyder, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Dec  1 04:15:11 np0005540741 systemd[1]: Started libpod-conmon-d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1.scope.
Dec  1 04:15:11 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:11 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f3934b2808e71c83464f81ce06020197d8739b5ad35fa4ac7282c8a8421f7a3/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:11 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f3934b2808e71c83464f81ce06020197d8739b5ad35fa4ac7282c8a8421f7a3/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:11 np0005540741 podman[96184]: 2025-12-01 09:15:11.694685881 +0000 UTC m=+0.116436526 container init d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1 (image=quay.io/ceph/ceph:v18, name=clever_snyder, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:15:11 np0005540741 podman[96184]: 2025-12-01 09:15:11.602716603 +0000 UTC m=+0.024467258 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:11 np0005540741 podman[96184]: 2025-12-01 09:15:11.70125387 +0000 UTC m=+0.123004515 container start d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1 (image=quay.io/ceph/ceph:v18, name=clever_snyder, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef)
Dec  1 04:15:11 np0005540741 podman[96184]: 2025-12-01 09:15:11.704446941 +0000 UTC m=+0.126197586 container attach d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1 (image=quay.io/ceph/ceph:v18, name=clever_snyder, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Dec  1 04:15:12 np0005540741 ceph-mon[75031]: Health check update: 3 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec  1 04:15:12 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/3179720452' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Dec  1 04:15:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"} v 0) v1
Dec  1 04:15:12 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/216207811' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Dec  1 04:15:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:15:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:15:12
Dec  1 04:15:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:15:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:15:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['volumes', 'cephfs.cephfs.data', '.mgr', 'vms', 'backups', 'images', 'cephfs.cephfs.meta']
Dec  1 04:15:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v87: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 1)
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 1)
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 1)
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 1)
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 1)
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 1)
Dec  1 04:15:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"} v 0) v1
Dec  1 04:15:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:15:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e36 do_prune osdmap full prune enabled
Dec  1 04:15:13 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/216207811' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Dec  1 04:15:13 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:15:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/216207811' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Dec  1 04:15:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:15:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e37 e37: 3 total, 3 up, 3 in
Dec  1 04:15:13 np0005540741 clever_snyder[96199]: enabled application 'cephfs' on pool 'cephfs.cephfs.meta'
Dec  1 04:15:13 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e37: 3 total, 3 up, 3 in
Dec  1 04:15:13 np0005540741 ceph-mgr[75324]: [progress INFO root] update: starting ev 266da851-2f25-4ebc-a75f-f2e1e54bce3c (PG autoscaler increasing pool 2 PGs from 1 to 32)
Dec  1 04:15:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"} v 0) v1
Dec  1 04:15:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:15:13 np0005540741 systemd[1]: libpod-d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1.scope: Deactivated successfully.
Dec  1 04:15:13 np0005540741 podman[96184]: 2025-12-01 09:15:13.263798872 +0000 UTC m=+1.685549517 container died d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1 (image=quay.io/ceph/ceph:v18, name=clever_snyder, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:15:13 np0005540741 systemd[1]: var-lib-containers-storage-overlay-7f3934b2808e71c83464f81ce06020197d8739b5ad35fa4ac7282c8a8421f7a3-merged.mount: Deactivated successfully.
Dec  1 04:15:13 np0005540741 podman[96184]: 2025-12-01 09:15:13.313389616 +0000 UTC m=+1.735140261 container remove d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1 (image=quay.io/ceph/ceph:v18, name=clever_snyder, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:15:13 np0005540741 systemd[1]: libpod-conmon-d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1.scope: Deactivated successfully.
Dec  1 04:15:13 np0005540741 python3[96262]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable cephfs.cephfs.data cephfs _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:13 np0005540741 podman[96263]: 2025-12-01 09:15:13.704241799 +0000 UTC m=+0.063753014 container create 7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042 (image=quay.io/ceph/ceph:v18, name=serene_easley, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:15:13 np0005540741 systemd[1]: Started libpod-conmon-7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042.scope.
Dec  1 04:15:13 np0005540741 podman[96263]: 2025-12-01 09:15:13.669776075 +0000 UTC m=+0.029287380 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:13 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:13 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35b9ad00e67a5cce58f6dd547bc22c9c514b4722e618803ab2663e39a68b02e0/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:13 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35b9ad00e67a5cce58f6dd547bc22c9c514b4722e618803ab2663e39a68b02e0/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:13 np0005540741 podman[96263]: 2025-12-01 09:15:13.781097707 +0000 UTC m=+0.140609002 container init 7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042 (image=quay.io/ceph/ceph:v18, name=serene_easley, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:13 np0005540741 podman[96263]: 2025-12-01 09:15:13.790880868 +0000 UTC m=+0.150392113 container start 7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042 (image=quay.io/ceph/ceph:v18, name=serene_easley, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:15:13 np0005540741 podman[96263]: 2025-12-01 09:15:13.797162057 +0000 UTC m=+0.156673352 container attach 7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042 (image=quay.io/ceph/ceph:v18, name=serene_easley, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec  1 04:15:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e37 do_prune osdmap full prune enabled
Dec  1 04:15:14 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:15:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e38 e38: 3 total, 3 up, 3 in
Dec  1 04:15:14 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e38: 3 total, 3 up, 3 in
Dec  1 04:15:14 np0005540741 ceph-mgr[75324]: [progress INFO root] update: starting ev 52c11968-3d0c-400a-9e65-b1c6a27534f8 (PG autoscaler increasing pool 3 PGs from 1 to 32)
Dec  1 04:15:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"} v 0) v1
Dec  1 04:15:14 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:15:14 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/216207811' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Dec  1 04:15:14 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:15:14 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:15:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"} v 0) v1
Dec  1 04:15:14 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3525570064' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Dec  1 04:15:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v90: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"} v 0) v1
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"} v 0) v1
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e38 do_prune osdmap full prune enabled
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3525570064' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e39 e39: 3 total, 3 up, 3 in
Dec  1 04:15:15 np0005540741 serene_easley[96278]: enabled application 'cephfs' on pool 'cephfs.cephfs.data'
Dec  1 04:15:15 np0005540741 systemd[1]: libpod-7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042.scope: Deactivated successfully.
Dec  1 04:15:15 np0005540741 podman[96263]: 2025-12-01 09:15:15.346238721 +0000 UTC m=+1.705749936 container died 7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042 (image=quay.io/ceph/ceph:v18, name=serene_easley, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e39: 3 total, 3 up, 3 in
Dec  1 04:15:15 np0005540741 ceph-mgr[75324]: [progress INFO root] update: starting ev 01d93ced-24a8-46f9-aecc-b9550ca5544f (PG autoscaler increasing pool 4 PGs from 1 to 32)
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"} v 0) v1
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:15:15 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 39 pg[3.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=39 pruub=11.517323494s) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active pruub 71.754142761s@ mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:15 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 39 pg[3.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=39 pruub=11.517323494s) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown pruub 71.754142761s@ mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/3525570064' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:15 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:15 np0005540741 systemd[1]: var-lib-containers-storage-overlay-35b9ad00e67a5cce58f6dd547bc22c9c514b4722e618803ab2663e39a68b02e0-merged.mount: Deactivated successfully.
Dec  1 04:15:15 np0005540741 podman[96263]: 2025-12-01 09:15:15.979755825 +0000 UTC m=+2.339267030 container remove 7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042 (image=quay.io/ceph/ceph:v18, name=serene_easley, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Dec  1 04:15:16 np0005540741 systemd[1]: libpod-conmon-7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042.scope: Deactivated successfully.
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 39 pg[2.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=39 pruub=8.982954979s) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active pruub 61.020610809s@ mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [2], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 39 pg[2.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=39 pruub=8.982954979s) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown pruub 61.020610809s@ mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e39 do_prune osdmap full prune enabled
Dec  1 04:15:16 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:15:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e40 e40: 3 total, 3 up, 3 in
Dec  1 04:15:16 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e40: 3 total, 3 up, 3 in
Dec  1 04:15:16 np0005540741 ceph-mgr[75324]: [progress INFO root] update: starting ev 5c9d7e9c-1f04-4651-aec0-3bbe4e7daa02 (PG autoscaler increasing pool 5 PGs from 1 to 32)
Dec  1 04:15:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"} v 0) v1
Dec  1 04:15:16 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1c( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1d( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.a( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1f( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1e( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1b( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1f( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1c( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.b( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1d( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.9( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.8( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1e( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.a( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.9( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.8( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.5( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.7( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.6( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.5( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.4( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.3( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.6( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.7( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.3( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.c( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.d( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.e( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.2( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.f( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.10( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.11( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.12( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.13( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.15( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.14( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.16( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.17( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.18( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.19( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.4( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.2( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.b( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.c( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.d( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.e( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.f( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.10( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.11( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.12( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.13( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.14( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.15( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.16( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.18( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.17( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.19( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1a( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1a( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1b( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1e( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.0( empty local-lis/les=39/40 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.e( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.12( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.10( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.14( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.4( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.0( empty local-lis/les=39/40 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.10( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.13( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.14( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.19( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1 scrub starts
Dec  1 04:15:16 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1 scrub ok
Dec  1 04:15:16 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:15:16 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/3525570064' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Dec  1 04:15:16 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:15:16 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:15:16 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:15:16 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:15:16 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:15:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v93: 69 pgs: 62 unknown, 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"} v 0) v1
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"} v 0) v1
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e40 do_prune osdmap full prune enabled
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e41 e41: 3 total, 3 up, 3 in
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e41: 3 total, 3 up, 3 in
Dec  1 04:15:17 np0005540741 ceph-mgr[75324]: [progress INFO root] update: starting ev 4e48849a-3329-4b0d-93ed-40d2131a9cce (PG autoscaler increasing pool 6 PGs from 1 to 32)
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"} v 0) v1
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:15:17 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 41 pg[5.0( empty local-lis/les=27/28 n=0 ec=27/27 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=13.927580833s) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active pruub 67.123153687s@ mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [2], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:17 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 41 pg[5.0( empty local-lis/les=27/28 n=0 ec=27/27 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=13.927580833s) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown pruub 67.123153687s@ mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:17 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.2 scrub starts
Dec  1 04:15:17 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.2 scrub ok
Dec  1 04:15:17 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1 scrub starts
Dec  1 04:15:17 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1 scrub ok
Dec  1 04:15:17 np0005540741 python3[96392]: ansible-ansible.legacy.stat Invoked with path=/tmp/ceph_mds.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : Cluster is now healthy
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:15:17 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 41 pg[4.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=41 pruub=11.467036247s) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active pruub 78.851821899s@ mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:17 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 41 pg[4.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=41 pruub=11.467036247s) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown pruub 78.851821899s@ mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:15:18 np0005540741 python3[96463]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580517.4108396-36519-113963609720530/source dest=/tmp/ceph_mds.yml mode=0644 force=True follow=False _original_basename=ceph_mds.yml.j2 checksum=e359e26d9e42bc107a0de03375144cf8590b6f68 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:15:18 np0005540741 ceph-mgr[75324]: [progress WARNING root] Starting Global Recovery Event,93 pgs not in active + clean state
Dec  1 04:15:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e41 do_prune osdmap full prune enabled
Dec  1 04:15:18 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:15:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e42 e42: 3 total, 3 up, 3 in
Dec  1 04:15:18 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e42: 3 total, 3 up, 3 in
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1e( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-mgr[75324]: [progress INFO root] update: starting ev 72549a5c-bd64-44e7-a176-0b67e4907218 (PG autoscaler increasing pool 7 PGs from 1 to 32)
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1f( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1d( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.7( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.8( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1c( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.b( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.6( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1b( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.a( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1a( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.5( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.9( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.4( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.19( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.3( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.2( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.d( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.e( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.f( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-mgr[75324]: [progress INFO root] complete: finished ev 266da851-2f25-4ebc-a75f-f2e1e54bce3c (PG autoscaler increasing pool 2 PGs from 1 to 32)
Dec  1 04:15:18 np0005540741 ceph-mgr[75324]: [progress INFO root] Completed event 266da851-2f25-4ebc-a75f-f2e1e54bce3c (PG autoscaler increasing pool 2 PGs from 1 to 32) in 5 seconds
Dec  1 04:15:18 np0005540741 ceph-mgr[75324]: [progress INFO root] complete: finished ev 52c11968-3d0c-400a-9e65-b1c6a27534f8 (PG autoscaler increasing pool 3 PGs from 1 to 32)
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.c( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-mgr[75324]: [progress INFO root] Completed event 52c11968-3d0c-400a-9e65-b1c6a27534f8 (PG autoscaler increasing pool 3 PGs from 1 to 32) in 4 seconds
Dec  1 04:15:18 np0005540741 ceph-mgr[75324]: [progress INFO root] complete: finished ev 01d93ced-24a8-46f9-aecc-b9550ca5544f (PG autoscaler increasing pool 4 PGs from 1 to 32)
Dec  1 04:15:18 np0005540741 ceph-mgr[75324]: [progress INFO root] Completed event 01d93ced-24a8-46f9-aecc-b9550ca5544f (PG autoscaler increasing pool 4 PGs from 1 to 32) in 3 seconds
Dec  1 04:15:18 np0005540741 ceph-mgr[75324]: [progress INFO root] complete: finished ev 5c9d7e9c-1f04-4651-aec0-3bbe4e7daa02 (PG autoscaler increasing pool 5 PGs from 1 to 32)
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.10( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-mgr[75324]: [progress INFO root] Completed event 5c9d7e9c-1f04-4651-aec0-3bbe4e7daa02 (PG autoscaler increasing pool 5 PGs from 1 to 32) in 2 seconds
Dec  1 04:15:18 np0005540741 ceph-mgr[75324]: [progress INFO root] complete: finished ev 4e48849a-3329-4b0d-93ed-40d2131a9cce (PG autoscaler increasing pool 6 PGs from 1 to 32)
Dec  1 04:15:18 np0005540741 ceph-mgr[75324]: [progress INFO root] Completed event 4e48849a-3329-4b0d-93ed-40d2131a9cce (PG autoscaler increasing pool 6 PGs from 1 to 32) in 1 seconds
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.11( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-mgr[75324]: [progress INFO root] complete: finished ev 72549a5c-bd64-44e7-a176-0b67e4907218 (PG autoscaler increasing pool 7 PGs from 1 to 32)
Dec  1 04:15:18 np0005540741 ceph-mgr[75324]: [progress INFO root] Completed event 72549a5c-bd64-44e7-a176-0b67e4907218 (PG autoscaler increasing pool 7 PGs from 1 to 32) in 0 seconds
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.13( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.12( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.15( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.17( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.16( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.18( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.14( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1c( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1d( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1f( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.10( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.11( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.12( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.13( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.14( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.15( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.16( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.17( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.9( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.8( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.a( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.b( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.7( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.6( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.4( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.3( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.5( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.2( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.d( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.6( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.19( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.3( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.0( empty local-lis/les=41/42 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.15( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.16( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.17( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.c( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.e( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.f( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1b( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1a( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.19( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1e( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.18( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.10( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.17( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.8( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.0( empty local-lis/les=41/42 n=0 ec=27/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.b( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.6( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1b( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:18 np0005540741 python3[96513]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /tmp/ceph_mds.yml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   fs volume create cephfs '--placement=compute-0 '#012 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:18 np0005540741 podman[96514]: 2025-12-01 09:15:18.577193515 +0000 UTC m=+0.056051009 container create cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34 (image=quay.io/ceph/ceph:v18, name=nice_sinoussi, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True)
Dec  1 04:15:18 np0005540741 systemd[1]: Started libpod-conmon-cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34.scope.
Dec  1 04:15:18 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78e3c9e348c0ac71434cdf9e5c89072567ddb145ceefc8a34d331a64629e89d2/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78e3c9e348c0ac71434cdf9e5c89072567ddb145ceefc8a34d331a64629e89d2/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78e3c9e348c0ac71434cdf9e5c89072567ddb145ceefc8a34d331a64629e89d2/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:18 np0005540741 podman[96514]: 2025-12-01 09:15:18.553986569 +0000 UTC m=+0.032844153 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:18 np0005540741 podman[96514]: 2025-12-01 09:15:18.65202323 +0000 UTC m=+0.130880744 container init cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34 (image=quay.io/ceph/ceph:v18, name=nice_sinoussi, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec  1 04:15:18 np0005540741 podman[96514]: 2025-12-01 09:15:18.658851027 +0000 UTC m=+0.137708521 container start cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34 (image=quay.io/ceph/ceph:v18, name=nice_sinoussi, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:15:18 np0005540741 podman[96514]: 2025-12-01 09:15:18.662076209 +0000 UTC m=+0.140933713 container attach cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34 (image=quay.io/ceph/ceph:v18, name=nice_sinoussi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:18 np0005540741 ceph-mon[75031]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Dec  1 04:15:18 np0005540741 ceph-mon[75031]: Cluster is now healthy
Dec  1 04:15:18 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:15:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v96: 131 pgs: 93 unknown, 38 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"} v 0) v1
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"} v 0) v1
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:19 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14242 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "compute-0 ", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:15:19 np0005540741 ceph-mgr[75324]: [volumes INFO volumes.module] Starting _cmd_fs_volume_create(name:cephfs, placement:compute-0 , prefix:fs volume create, target:['mon-mgr', '']) < ""
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"} v 0) v1
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"} v 0) v1
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"} v 0) v1
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e42 do_prune osdmap full prune enabled
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: log_channel(cluster) log [WRN] : Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Dec  1 04:15:19 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0[75027]: 2025-12-01T09:15:19.275+0000 7ff93f5e2640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).mds e2 new map
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).mds e2 print_map#012e2#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:15:19.277019+0000#012modified#0112025-12-01T09:15:19.277097+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e43 e43: 3 total, 3 up, 3 in
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e43: 3 total, 3 up, 3 in
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : fsmap cephfs:0
Dec  1 04:15:19 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 43 pg[6.0( empty local-lis/les=29/30 n=0 ec=29/29 lis/c=29/29 les/c/f=30/30/0 sis=43 pruub=14.023229599s) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active pruub 82.905815125s@ mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:19 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Saving service mds.cephfs spec with placement compute-0
Dec  1 04:15:19 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Saving service mds.cephfs spec with placement compute-0
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mds.cephfs}] v 0) v1
Dec  1 04:15:19 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 43 pg[6.0( empty local-lis/les=29/30 n=0 ec=29/29 lis/c=29/29 les/c/f=30/30/0 sis=43 pruub=14.023229599s) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown pruub 82.905815125s@ mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:19 np0005540741 ceph-mgr[75324]: [volumes INFO volumes.module] Finishing _cmd_fs_volume_create(name:cephfs, placement:compute-0 , prefix:fs volume create, target:['mon-mgr', '']) < ""
Dec  1 04:15:19 np0005540741 systemd[1]: libpod-cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34.scope: Deactivated successfully.
Dec  1 04:15:19 np0005540741 conmon[96529]: conmon cabf666166aa2bf023f2 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34.scope/container/memory.events
Dec  1 04:15:19 np0005540741 podman[96514]: 2025-12-01 09:15:19.327393431 +0000 UTC m=+0.806250925 container died cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34 (image=quay.io/ceph/ceph:v18, name=nice_sinoussi, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:15:19 np0005540741 systemd[1]: var-lib-containers-storage-overlay-78e3c9e348c0ac71434cdf9e5c89072567ddb145ceefc8a34d331a64629e89d2-merged.mount: Deactivated successfully.
Dec  1 04:15:19 np0005540741 systemd[76658]: Starting Mark boot as successful...
Dec  1 04:15:19 np0005540741 systemd[76658]: Finished Mark boot as successful.
Dec  1 04:15:19 np0005540741 podman[96514]: 2025-12-01 09:15:19.374451594 +0000 UTC m=+0.853309088 container remove cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34 (image=quay.io/ceph/ceph:v18, name=nice_sinoussi, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec  1 04:15:19 np0005540741 systemd[1]: libpod-conmon-cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34.scope: Deactivated successfully.
Dec  1 04:15:19 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.2 deep-scrub starts
Dec  1 04:15:19 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.2 deep-scrub ok
Dec  1 04:15:19 np0005540741 python3[96692]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /tmp/ceph_mds.yml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch apply --in-file /home/ceph_spec.yaml _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:15:19 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:19 np0005540741 podman[96702]: 2025-12-01 09:15:19.777158293 +0000 UTC m=+0.041515529 container create f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64 (image=quay.io/ceph/ceph:v18, name=epic_kepler, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  1 04:15:19 np0005540741 systemd[1]: Started libpod-conmon-f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64.scope.
Dec  1 04:15:19 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:19 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32cca5f0e3b422a5e7501e00004c7f634a5cb05b3e828aa9e868943f0855073a/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:19 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32cca5f0e3b422a5e7501e00004c7f634a5cb05b3e828aa9e868943f0855073a/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:19 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32cca5f0e3b422a5e7501e00004c7f634a5cb05b3e828aa9e868943f0855073a/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:19 np0005540741 podman[96702]: 2025-12-01 09:15:19.854034042 +0000 UTC m=+0.118391298 container init f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64 (image=quay.io/ceph/ceph:v18, name=epic_kepler, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec  1 04:15:19 np0005540741 podman[96702]: 2025-12-01 09:15:19.760443872 +0000 UTC m=+0.024801148 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:19 np0005540741 podman[96702]: 2025-12-01 09:15:19.863882114 +0000 UTC m=+0.128239360 container start f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64 (image=quay.io/ceph/ceph:v18, name=epic_kepler, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec  1 04:15:19 np0005540741 podman[96702]: 2025-12-01 09:15:19.868509831 +0000 UTC m=+0.132867097 container attach f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64 (image=quay.io/ceph/ceph:v18, name=epic_kepler, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  1 04:15:20 np0005540741 podman[96784]: 2025-12-01 09:15:20.145957895 +0000 UTC m=+0.060561462 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec  1 04:15:20 np0005540741 podman[96784]: 2025-12-01 09:15:20.264123285 +0000 UTC m=+0.178726892 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e43 do_prune osdmap full prune enabled
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e44 e44: 3 total, 3 up, 3 in
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e44: 3 total, 3 up, 3 in
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1a( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.14( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.17( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.16( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.11( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.10( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.13( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.12( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.d( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.c( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.f( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.e( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.2( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.3( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1b( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.6( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.7( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.8( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.19( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.4( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.9( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.b( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.15( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.5( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.a( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1e( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1f( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1c( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1d( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.18( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1a( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.16( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.10( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.12( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.0( empty local-lis/les=43/44 n=0 ec=29/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.3( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.7( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.19( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.9( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.5( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.a( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.18( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:20 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14244 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:15:20 np0005540741 ceph-mgr[75324]: [cephadm INFO root] Saving service mds.cephfs spec with placement compute-0
Dec  1 04:15:20 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Saving service mds.cephfs spec with placement compute-0
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mds.cephfs}] v 0) v1
Dec  1 04:15:20 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.3 scrub starts
Dec  1 04:15:20 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.3 scrub ok
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:20 np0005540741 epic_kepler[96734]: Scheduled mds.cephfs update...
Dec  1 04:15:20 np0005540741 systemd[1]: libpod-f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64.scope: Deactivated successfully.
Dec  1 04:15:20 np0005540741 podman[96702]: 2025-12-01 09:15:20.760243558 +0000 UTC m=+1.024600804 container died f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64 (image=quay.io/ceph/ceph:v18, name=epic_kepler, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: Saving service mds.cephfs spec with placement compute-0
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:20 np0005540741 systemd[1]: var-lib-containers-storage-overlay-32cca5f0e3b422a5e7501e00004c7f634a5cb05b3e828aa9e868943f0855073a-merged.mount: Deactivated successfully.
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:20 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev cb41a296-48fd-4efe-b232-6bd37d0d4845 does not exist
Dec  1 04:15:20 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 789f6ee6-c2dc-4926-8b19-6aa0ec081238 does not exist
Dec  1 04:15:20 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev dd9c7641-4a36-4186-88e3-e2c28d7194a7 does not exist
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:15:20 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:15:20 np0005540741 podman[96702]: 2025-12-01 09:15:20.960612736 +0000 UTC m=+1.224969992 container remove f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64 (image=quay.io/ceph/ceph:v18, name=epic_kepler, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec  1 04:15:20 np0005540741 systemd[1]: libpod-conmon-f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64.scope: Deactivated successfully.
Dec  1 04:15:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v99: 193 pgs: 1 peering, 62 unknown, 130 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 43 pg[7.0( empty local-lis/les=31/32 n=0 ec=31/31 lis/c=31/31 les/c/f=32/32/0 sis=43 pruub=13.941397667s) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active pruub 79.873344421s@ mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.0( empty local-lis/les=31/32 n=0 ec=31/31 lis/c=31/31 les/c/f=32/32/0 sis=43 pruub=13.941397667s) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown pruub 79.873344421s@ mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.11( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.12( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.17( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.18( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.19( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.1a( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.4( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.3( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.1( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.2( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.5( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.6( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.9( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.a( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.7( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.8( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.d( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.e( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.b( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.c( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.f( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.10( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.13( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.14( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.15( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.16( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.1b( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.1f( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.1c( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.1d( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.1e( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:21 np0005540741 podman[97160]: 2025-12-01 09:15:21.646910533 +0000 UTC m=+0.073455512 container create 0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_mcclintock, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:15:21 np0005540741 python3[97147]: ansible-ansible.legacy.stat Invoked with path=/etc/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:15:21 np0005540741 systemd[1]: Started libpod-conmon-0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50.scope.
Dec  1 04:15:21 np0005540741 podman[97160]: 2025-12-01 09:15:21.61751335 +0000 UTC m=+0.044058379 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:21 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:21 np0005540741 podman[97160]: 2025-12-01 09:15:21.757596935 +0000 UTC m=+0.184141884 container init 0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_mcclintock, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec  1 04:15:21 np0005540741 podman[97160]: 2025-12-01 09:15:21.768933634 +0000 UTC m=+0.195478573 container start 0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_mcclintock, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:15:21 np0005540741 podman[97160]: 2025-12-01 09:15:21.772973332 +0000 UTC m=+0.199518381 container attach 0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_mcclintock, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec  1 04:15:21 np0005540741 focused_mcclintock[97181]: 167 167
Dec  1 04:15:21 np0005540741 systemd[1]: libpod-0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50.scope: Deactivated successfully.
Dec  1 04:15:21 np0005540741 podman[97160]: 2025-12-01 09:15:21.774978536 +0000 UTC m=+0.201523475 container died 0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_mcclintock, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec  1 04:15:21 np0005540741 systemd[1]: var-lib-containers-storage-overlay-29fcaa556f8c12a0506b8bbb81bc7f54c5f9e526643734d5c91d358af0b975b4-merged.mount: Deactivated successfully.
Dec  1 04:15:21 np0005540741 podman[97160]: 2025-12-01 09:15:21.81543008 +0000 UTC m=+0.241975029 container remove 0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_mcclintock, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:15:21 np0005540741 systemd[1]: libpod-conmon-0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50.scope: Deactivated successfully.
Dec  1 04:15:21 np0005540741 ceph-mon[75031]: Saving service mds.cephfs spec with placement compute-0
Dec  1 04:15:21 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:21 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:21 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:15:21 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:21 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:15:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e44 do_prune osdmap full prune enabled
Dec  1 04:15:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e45 e45: 3 total, 3 up, 3 in
Dec  1 04:15:21 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e45: 3 total, 3 up, 3 in
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.1e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.12( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.10( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.1d( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.17( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.16( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.14( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.7( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.d( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.19( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.0( empty local-lis/les=43/45 n=0 ec=31/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:21 np0005540741 podman[97274]: 2025-12-01 09:15:21.991115495 +0000 UTC m=+0.057898119 container create 68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_yalow, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2)
Dec  1 04:15:22 np0005540741 systemd[1]: Started libpod-conmon-68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d.scope.
Dec  1 04:15:22 np0005540741 podman[97274]: 2025-12-01 09:15:21.957110815 +0000 UTC m=+0.023893489 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:22 np0005540741 python3[97268]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580521.346623-36549-116230047858529/source dest=/etc/ceph/ceph.client.openstack.keyring mode=0644 force=True owner=167 group=167 follow=False _original_basename=ceph_key.j2 checksum=30c595aa84bea916cfc9cc906a8788f27659122a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:15:22 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75e4d34a1e513df5001d46cffe4aa4a368565cafde0d38f59cc8b838a67b07f7/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75e4d34a1e513df5001d46cffe4aa4a368565cafde0d38f59cc8b838a67b07f7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75e4d34a1e513df5001d46cffe4aa4a368565cafde0d38f59cc8b838a67b07f7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75e4d34a1e513df5001d46cffe4aa4a368565cafde0d38f59cc8b838a67b07f7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75e4d34a1e513df5001d46cffe4aa4a368565cafde0d38f59cc8b838a67b07f7/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:22 np0005540741 podman[97274]: 2025-12-01 09:15:22.087777632 +0000 UTC m=+0.154560286 container init 68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_yalow, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True)
Dec  1 04:15:22 np0005540741 podman[97274]: 2025-12-01 09:15:22.097697977 +0000 UTC m=+0.164480601 container start 68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_yalow, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:15:22 np0005540741 podman[97274]: 2025-12-01 09:15:22.101526158 +0000 UTC m=+0.168308802 container attach 68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_yalow, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec  1 04:15:22 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Dec  1 04:15:22 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Dec  1 04:15:22 np0005540741 python3[97344]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   auth import -i /etc/ceph/ceph.client.openstack.keyring _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:22 np0005540741 podman[97345]: 2025-12-01 09:15:22.61843059 +0000 UTC m=+0.026606595 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:22 np0005540741 podman[97345]: 2025-12-01 09:15:22.821252176 +0000 UTC m=+0.229428181 container create dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b (image=quay.io/ceph/ceph:v18, name=focused_bell, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default)
Dec  1 04:15:22 np0005540741 systemd[1]: Started libpod-conmon-dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b.scope.
Dec  1 04:15:22 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d124a8e1798da8aca34e5353f3f6e9595804cce06e967f4a5109e74aaa046729/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d124a8e1798da8aca34e5353f3f6e9595804cce06e967f4a5109e74aaa046729/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:22 np0005540741 podman[97345]: 2025-12-01 09:15:22.940925914 +0000 UTC m=+0.349101939 container init dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b (image=quay.io/ceph/ceph:v18, name=focused_bell, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec  1 04:15:22 np0005540741 podman[97345]: 2025-12-01 09:15:22.948772073 +0000 UTC m=+0.356948078 container start dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b (image=quay.io/ceph/ceph:v18, name=focused_bell, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2)
Dec  1 04:15:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:15:22 np0005540741 podman[97345]: 2025-12-01 09:15:22.952498741 +0000 UTC m=+0.360674856 container attach dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b (image=quay.io/ceph/ceph:v18, name=focused_bell, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:15:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v101: 193 pgs: 1 peering, 31 unknown, 161 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:23 np0005540741 ceph-mgr[75324]: [progress INFO root] Writing back 9 completed events
Dec  1 04:15:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) v1
Dec  1 04:15:23 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:23 np0005540741 determined_yalow[97290]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:15:23 np0005540741 determined_yalow[97290]: --> relative data size: 1.0
Dec  1 04:15:23 np0005540741 determined_yalow[97290]: --> All data devices are unavailable
Dec  1 04:15:23 np0005540741 systemd[1]: libpod-68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d.scope: Deactivated successfully.
Dec  1 04:15:23 np0005540741 podman[97274]: 2025-12-01 09:15:23.242128912 +0000 UTC m=+1.308911536 container died 68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_yalow, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec  1 04:15:23 np0005540741 systemd[1]: libpod-68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d.scope: Consumed 1.053s CPU time.
Dec  1 04:15:23 np0005540741 systemd[1]: var-lib-containers-storage-overlay-75e4d34a1e513df5001d46cffe4aa4a368565cafde0d38f59cc8b838a67b07f7-merged.mount: Deactivated successfully.
Dec  1 04:15:23 np0005540741 podman[97274]: 2025-12-01 09:15:23.341218456 +0000 UTC m=+1.408001080 container remove 68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_yalow, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec  1 04:15:23 np0005540741 systemd[1]: libpod-conmon-68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d.scope: Deactivated successfully.
Dec  1 04:15:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth import"} v 0) v1
Dec  1 04:15:23 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2988375873' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Dec  1 04:15:23 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2988375873' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Dec  1 04:15:23 np0005540741 systemd[1]: libpod-dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b.scope: Deactivated successfully.
Dec  1 04:15:23 np0005540741 podman[97345]: 2025-12-01 09:15:23.623243525 +0000 UTC m=+1.031419540 container died dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b (image=quay.io/ceph/ceph:v18, name=focused_bell, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:15:23 np0005540741 systemd[1]: var-lib-containers-storage-overlay-d124a8e1798da8aca34e5353f3f6e9595804cce06e967f4a5109e74aaa046729-merged.mount: Deactivated successfully.
Dec  1 04:15:23 np0005540741 podman[97345]: 2025-12-01 09:15:23.671143075 +0000 UTC m=+1.079319080 container remove dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b (image=quay.io/ceph/ceph:v18, name=focused_bell, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:15:23 np0005540741 systemd[1]: libpod-conmon-dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b.scope: Deactivated successfully.
Dec  1 04:15:23 np0005540741 podman[97576]: 2025-12-01 09:15:23.954732934 +0000 UTC m=+0.038722020 container create 6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_hermann, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:15:23 np0005540741 systemd[1]: Started libpod-conmon-6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43.scope.
Dec  1 04:15:24 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:24 np0005540741 podman[97576]: 2025-12-01 09:15:24.026883633 +0000 UTC m=+0.110872719 container init 6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_hermann, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec  1 04:15:24 np0005540741 podman[97576]: 2025-12-01 09:15:24.032645906 +0000 UTC m=+0.116634992 container start 6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_hermann, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:15:24 np0005540741 podman[97576]: 2025-12-01 09:15:23.937938591 +0000 UTC m=+0.021927697 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:24 np0005540741 podman[97576]: 2025-12-01 09:15:24.035872699 +0000 UTC m=+0.119861815 container attach 6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_hermann, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:15:24 np0005540741 elegant_hermann[97592]: 167 167
Dec  1 04:15:24 np0005540741 systemd[1]: libpod-6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43.scope: Deactivated successfully.
Dec  1 04:15:24 np0005540741 podman[97576]: 2025-12-01 09:15:24.037279813 +0000 UTC m=+0.121268899 container died 6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_hermann, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec  1 04:15:24 np0005540741 systemd[1]: var-lib-containers-storage-overlay-a59471fc1f4b9b9e7a4c6c4c40b4394a030bf9f5ed40c7d8ed24d26263abcf46-merged.mount: Deactivated successfully.
Dec  1 04:15:24 np0005540741 podman[97576]: 2025-12-01 09:15:24.073947837 +0000 UTC m=+0.157936923 container remove 6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_hermann, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec  1 04:15:24 np0005540741 systemd[1]: libpod-conmon-6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43.scope: Deactivated successfully.
Dec  1 04:15:24 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:24 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/2988375873' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Dec  1 04:15:24 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/2988375873' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Dec  1 04:15:24 np0005540741 podman[97638]: 2025-12-01 09:15:24.218891906 +0000 UTC m=+0.044038098 container create 6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_knuth, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:15:24 np0005540741 systemd[1]: Started libpod-conmon-6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599.scope.
Dec  1 04:15:24 np0005540741 podman[97638]: 2025-12-01 09:15:24.203463286 +0000 UTC m=+0.028609498 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:24 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35e4821a2616c8bc260468275709e7d0c2311c8703514c6d389c30af2f0f55a6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35e4821a2616c8bc260468275709e7d0c2311c8703514c6d389c30af2f0f55a6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35e4821a2616c8bc260468275709e7d0c2311c8703514c6d389c30af2f0f55a6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35e4821a2616c8bc260468275709e7d0c2311c8703514c6d389c30af2f0f55a6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:24 np0005540741 python3[97647]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   status --format json | jq .monmap.num_mons _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:24 np0005540741 podman[97638]: 2025-12-01 09:15:24.340695381 +0000 UTC m=+0.165841603 container init 6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_knuth, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Dec  1 04:15:24 np0005540741 podman[97638]: 2025-12-01 09:15:24.351115652 +0000 UTC m=+0.176261844 container start 6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_knuth, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:15:24 np0005540741 podman[97638]: 2025-12-01 09:15:24.372006345 +0000 UTC m=+0.197152557 container attach 6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_knuth, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec  1 04:15:24 np0005540741 podman[97662]: 2025-12-01 09:15:24.39833522 +0000 UTC m=+0.046425114 container create 84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a (image=quay.io/ceph/ceph:v18, name=intelligent_kepler, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Dec  1 04:15:24 np0005540741 systemd[1]: Started libpod-conmon-84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a.scope.
Dec  1 04:15:24 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55f1a9227a2d40fdbb98e5afc7638945fda74a55235b768026b37860b935e029/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55f1a9227a2d40fdbb98e5afc7638945fda74a55235b768026b37860b935e029/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:24 np0005540741 podman[97662]: 2025-12-01 09:15:24.471680357 +0000 UTC m=+0.119770281 container init 84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a (image=quay.io/ceph/ceph:v18, name=intelligent_kepler, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:15:24 np0005540741 podman[97662]: 2025-12-01 09:15:24.379195183 +0000 UTC m=+0.027285097 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:24 np0005540741 podman[97662]: 2025-12-01 09:15:24.477967597 +0000 UTC m=+0.126057491 container start 84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a (image=quay.io/ceph/ceph:v18, name=intelligent_kepler, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 04:15:24 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.2 deep-scrub starts
Dec  1 04:15:24 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.2 deep-scrub ok
Dec  1 04:15:24 np0005540741 podman[97662]: 2025-12-01 09:15:24.65988782 +0000 UTC m=+0.307977734 container attach 84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a (image=quay.io/ceph/ceph:v18, name=intelligent_kepler, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:15:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v102: 193 pgs: 1 peering, 31 unknown, 161 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]: {
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:    "0": [
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:        {
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "devices": [
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "/dev/loop3"
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            ],
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "lv_name": "ceph_lv0",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "lv_size": "21470642176",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "name": "ceph_lv0",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "tags": {
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.cluster_name": "ceph",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.crush_device_class": "",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.encrypted": "0",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.osd_id": "0",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.type": "block",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.vdo": "0"
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            },
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "type": "block",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "vg_name": "ceph_vg0"
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:        }
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:    ],
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:    "1": [
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:        {
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "devices": [
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "/dev/loop4"
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            ],
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "lv_name": "ceph_lv1",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "lv_size": "21470642176",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "name": "ceph_lv1",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "tags": {
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.cluster_name": "ceph",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.crush_device_class": "",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.encrypted": "0",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.osd_id": "1",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.type": "block",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.vdo": "0"
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            },
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "type": "block",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "vg_name": "ceph_vg1"
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:        }
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:    ],
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:    "2": [
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:        {
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "devices": [
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "/dev/loop5"
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            ],
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "lv_name": "ceph_lv2",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "lv_size": "21470642176",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "name": "ceph_lv2",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "tags": {
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.cluster_name": "ceph",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.crush_device_class": "",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.encrypted": "0",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.osd_id": "2",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.type": "block",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:                "ceph.vdo": "0"
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            },
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "type": "block",
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:            "vg_name": "ceph_vg2"
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:        }
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]:    ]
Dec  1 04:15:25 np0005540741 interesting_knuth[97658]: }
Dec  1 04:15:25 np0005540741 systemd[1]: libpod-6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599.scope: Deactivated successfully.
Dec  1 04:15:25 np0005540741 podman[97638]: 2025-12-01 09:15:25.195685091 +0000 UTC m=+1.020831283 container died 6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_knuth, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:15:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json"} v 0) v1
Dec  1 04:15:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/480516295' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec  1 04:15:25 np0005540741 intelligent_kepler[97680]: 
Dec  1 04:15:25 np0005540741 intelligent_kepler[97680]: {"fsid":"5620a9fb-e540-5250-a0e8-7aaad5347e3b","health":{"status":"HEALTH_ERR","checks":{"MDS_ALL_DOWN":{"severity":"HEALTH_ERR","summary":{"message":"1 filesystem is offline","count":1},"muted":false},"MDS_UP_LESS_THAN_MAX":{"severity":"HEALTH_WARN","summary":{"message":"1 filesystem is online with fewer MDS than max_mds","count":1},"muted":false}},"mutes":[]},"election_epoch":5,"quorum":[0],"quorum_names":["compute-0"],"quorum_age":182,"monmap":{"epoch":1,"min_mon_release_name":"reef","num_mons":1},"osdmap":{"epoch":45,"num_osds":3,"num_up_osds":3,"osd_up_since":1764580474,"num_in_osds":3,"osd_in_since":1764580437,"num_remapped_pgs":0},"pgmap":{"pgs_by_state":[{"state_name":"active+clean","count":161},{"state_name":"unknown","count":31},{"state_name":"peering","count":1}],"num_pgs":193,"num_pools":7,"num_objects":2,"data_bytes":459280,"bytes_used":84180992,"bytes_avail":64327745536,"bytes_total":64411926528,"unknown_pgs_ratio":0.1606217622756958,"inactive_pgs_ratio":0.005181347019970417},"fsmap":{"epoch":2,"id":1,"up":0,"in":0,"max":1,"by_rank":[],"up:standby":0},"mgrmap":{"available":true,"num_standbys":0,"modules":["cephadm","iostat","nfs","restful"],"services":{}},"servicemap":{"epoch":3,"modified":"2025-12-01T09:15:21.032921+0000","services":{"osd":{"daemons":{"summary":"","1":{"start_epoch":0,"start_stamp":"0.000000","gid":0,"addr":"(unrecognized address family 0)/0","metadata":{},"task_status":{}},"2":{"start_epoch":0,"start_stamp":"0.000000","gid":0,"addr":"(unrecognized address family 0)/0","metadata":{},"task_status":{}}}}}},"progress_events":{"6a1eadee-2fd7-4097-9f7f-4e2c6af1e403":{"message":"Global Recovery Event (0s)\n      [............................] ","progress":0,"add_to_ceph_s":true}}}
Dec  1 04:15:25 np0005540741 systemd[1]: var-lib-containers-storage-overlay-35e4821a2616c8bc260468275709e7d0c2311c8703514c6d389c30af2f0f55a6-merged.mount: Deactivated successfully.
Dec  1 04:15:25 np0005540741 systemd[1]: libpod-84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a.scope: Deactivated successfully.
Dec  1 04:15:25 np0005540741 podman[97662]: 2025-12-01 09:15:25.261158499 +0000 UTC m=+0.909248393 container died 84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a (image=quay.io/ceph/ceph:v18, name=intelligent_kepler, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:15:25 np0005540741 podman[97638]: 2025-12-01 09:15:25.357463104 +0000 UTC m=+1.182609336 container remove 6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_knuth, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Dec  1 04:15:25 np0005540741 systemd[1]: libpod-conmon-6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599.scope: Deactivated successfully.
Dec  1 04:15:25 np0005540741 systemd[1]: var-lib-containers-storage-overlay-55f1a9227a2d40fdbb98e5afc7638945fda74a55235b768026b37860b935e029-merged.mount: Deactivated successfully.
Dec  1 04:15:25 np0005540741 podman[97662]: 2025-12-01 09:15:25.420401061 +0000 UTC m=+1.068490955 container remove 84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a (image=quay.io/ceph/ceph:v18, name=intelligent_kepler, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Dec  1 04:15:25 np0005540741 systemd[1]: libpod-conmon-84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a.scope: Deactivated successfully.
Dec  1 04:15:25 np0005540741 python3[97832]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   mon dump --format json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:25 np0005540741 podman[97860]: 2025-12-01 09:15:25.80989538 +0000 UTC m=+0.046677802 container create 3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109 (image=quay.io/ceph/ceph:v18, name=relaxed_bohr, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:15:25 np0005540741 systemd[1]: Started libpod-conmon-3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109.scope.
Dec  1 04:15:25 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86c0ac1fae0c8f4087fdf12a1783d13fdbbedbc5ec3da1c971df73ea3cec59d3/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86c0ac1fae0c8f4087fdf12a1783d13fdbbedbc5ec3da1c971df73ea3cec59d3/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:25 np0005540741 podman[97860]: 2025-12-01 09:15:25.791359672 +0000 UTC m=+0.028142134 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:25 np0005540741 podman[97860]: 2025-12-01 09:15:25.888134723 +0000 UTC m=+0.124917165 container init 3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109 (image=quay.io/ceph/ceph:v18, name=relaxed_bohr, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:15:25 np0005540741 podman[97860]: 2025-12-01 09:15:25.894847426 +0000 UTC m=+0.131629848 container start 3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109 (image=quay.io/ceph/ceph:v18, name=relaxed_bohr, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:15:25 np0005540741 podman[97860]: 2025-12-01 09:15:25.898465951 +0000 UTC m=+0.135248393 container attach 3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109 (image=quay.io/ceph/ceph:v18, name=relaxed_bohr, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:15:26 np0005540741 podman[97914]: 2025-12-01 09:15:26.016377872 +0000 UTC m=+0.039784863 container create d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_banach, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec  1 04:15:26 np0005540741 systemd[1]: Started libpod-conmon-d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9.scope.
Dec  1 04:15:26 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:26 np0005540741 podman[97914]: 2025-12-01 09:15:26.079429883 +0000 UTC m=+0.102836884 container init d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_banach, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507)
Dec  1 04:15:26 np0005540741 podman[97914]: 2025-12-01 09:15:26.086379714 +0000 UTC m=+0.109786705 container start d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_banach, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  1 04:15:26 np0005540741 adoring_banach[97930]: 167 167
Dec  1 04:15:26 np0005540741 systemd[1]: libpod-d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9.scope: Deactivated successfully.
Dec  1 04:15:26 np0005540741 podman[97914]: 2025-12-01 09:15:26.091584179 +0000 UTC m=+0.114991170 container attach d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_banach, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec  1 04:15:26 np0005540741 podman[97914]: 2025-12-01 09:15:26.091861558 +0000 UTC m=+0.115268549 container died d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_banach, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Dec  1 04:15:26 np0005540741 podman[97914]: 2025-12-01 09:15:25.997862005 +0000 UTC m=+0.021269016 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:26 np0005540741 systemd[1]: var-lib-containers-storage-overlay-70e0c6b6e1514faddd4232a8473fd5eb321c4883828105f80e0a3276a193aef6-merged.mount: Deactivated successfully.
Dec  1 04:15:26 np0005540741 podman[97914]: 2025-12-01 09:15:26.129085739 +0000 UTC m=+0.152492730 container remove d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_banach, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec  1 04:15:26 np0005540741 systemd[1]: libpod-conmon-d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9.scope: Deactivated successfully.
Dec  1 04:15:26 np0005540741 podman[97973]: 2025-12-01 09:15:26.320658528 +0000 UTC m=+0.076105116 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:26 np0005540741 podman[97973]: 2025-12-01 09:15:26.464261705 +0000 UTC m=+0.219708253 container create 8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_boyd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:15:26 np0005540741 systemd[1]: Started libpod-conmon-8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45.scope.
Dec  1 04:15:26 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:26 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/379c11194edf416e878dc71ace62775bd6357f827bf6a76c2c0eb0144f65a58f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:26 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/379c11194edf416e878dc71ace62775bd6357f827bf6a76c2c0eb0144f65a58f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:26 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/379c11194edf416e878dc71ace62775bd6357f827bf6a76c2c0eb0144f65a58f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:26 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/379c11194edf416e878dc71ace62775bd6357f827bf6a76c2c0eb0144f65a58f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:26 np0005540741 podman[97973]: 2025-12-01 09:15:26.553587699 +0000 UTC m=+0.309034247 container init 8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_boyd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec  1 04:15:26 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Dec  1 04:15:26 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1127224077' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  1 04:15:26 np0005540741 relaxed_bohr[97888]: 
Dec  1 04:15:26 np0005540741 relaxed_bohr[97888]: {"epoch":1,"fsid":"5620a9fb-e540-5250-a0e8-7aaad5347e3b","modified":"2025-12-01T09:12:18.204879Z","created":"2025-12-01T09:12:18.204879Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"compute-0","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.122.100:3300","nonce":0},{"type":"v1","addr":"192.168.122.100:6789","nonce":0}]},"addr":"192.168.122.100:6789/0","public_addr":"192.168.122.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
Dec  1 04:15:26 np0005540741 relaxed_bohr[97888]: dumped monmap epoch 1
Dec  1 04:15:26 np0005540741 podman[97973]: 2025-12-01 09:15:26.562515702 +0000 UTC m=+0.317962250 container start 8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_boyd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:26 np0005540741 podman[97973]: 2025-12-01 09:15:26.566814329 +0000 UTC m=+0.322260897 container attach 8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_boyd, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec  1 04:15:26 np0005540741 systemd[1]: libpod-3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109.scope: Deactivated successfully.
Dec  1 04:15:26 np0005540741 podman[97860]: 2025-12-01 09:15:26.578139868 +0000 UTC m=+0.814922320 container died 3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109 (image=quay.io/ceph/ceph:v18, name=relaxed_bohr, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:26 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.3 scrub starts
Dec  1 04:15:26 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.3 scrub ok
Dec  1 04:15:26 np0005540741 systemd[1]: var-lib-containers-storage-overlay-86c0ac1fae0c8f4087fdf12a1783d13fdbbedbc5ec3da1c971df73ea3cec59d3-merged.mount: Deactivated successfully.
Dec  1 04:15:26 np0005540741 podman[97860]: 2025-12-01 09:15:26.626842034 +0000 UTC m=+0.863624456 container remove 3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109 (image=quay.io/ceph/ceph:v18, name=relaxed_bohr, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:15:26 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.4 scrub starts
Dec  1 04:15:26 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.4 scrub ok
Dec  1 04:15:26 np0005540741 systemd[1]: libpod-conmon-3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109.scope: Deactivated successfully.
Dec  1 04:15:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v103: 193 pgs: 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"} v 0) v1
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"} v 0) v1
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"} v 0) v1
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"} v 0) v1
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"} v 0) v1
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"} v 0) v1
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:27 np0005540741 python3[98034]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   auth get client.openstack _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e45 do_prune osdmap full prune enabled
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e46 e46: 3 total, 3 up, 3 in
Dec  1 04:15:27 np0005540741 podman[98035]: 2025-12-01 09:15:27.258542159 +0000 UTC m=+0.074003590 container create 9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584 (image=quay.io/ceph/ceph:v18, name=amazing_tharp, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e46: 3 total, 3 up, 3 in
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.065162659s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.925071716s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.053224564s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913154602s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.065100670s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925071716s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.053148270s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913154602s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.053172112s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913414001s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.053153992s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913414001s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.101153374s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961578369s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.101132393s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961578369s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100970268s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961547852s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100948334s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961547852s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.052714348s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913444519s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.052694321s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913444519s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100719452s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961563110s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100702286s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961563110s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100515366s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961517334s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100497246s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961517334s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.052310944s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913497925s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100352287s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961570740s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.052274704s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913497925s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100304604s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961570740s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099813461s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961418152s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099791527s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961418152s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099128723s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961410522s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099099159s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961410522s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.051272392s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913627625s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099069595s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961448669s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.051211357s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913597107s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099026680s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961448669s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.051125526s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913627625s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098785400s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961402893s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.051002502s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913635254s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098758698s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961402893s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050975800s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913635254s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050869942s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913658142s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050841331s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913658142s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098507881s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961387634s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050839424s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913719177s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050799370s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913719177s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098463058s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961387634s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098623276s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961585999s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098595619s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961585999s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050669670s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913757324s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098088264s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961196899s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050644875s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913757324s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098061562s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961196899s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061552048s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.924736023s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097998619s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961250305s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061717033s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.924995422s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097971916s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961250305s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061690331s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924995422s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061531067s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924736023s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097698212s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961128235s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097724915s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961204529s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097672462s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961128235s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097575188s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961067200s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097699165s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961204529s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097554207s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961067200s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061326981s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.924926758s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073251724s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184906006s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073219299s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184906006s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098457336s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210304260s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098435402s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210304260s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103429794s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215400696s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103402138s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215400696s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072329521s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184417725s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072308540s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184417725s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072682381s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184875488s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072663307s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184875488s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072598457s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184898376s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072576523s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184898376s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071966171s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184402466s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071948051s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184402466s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097693443s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210258484s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097675323s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210258484s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071633339s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184318542s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071617126s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097608566s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210380554s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097588539s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210380554s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097500801s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210418701s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097480774s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210418701s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068525314s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184318542s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094830513s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210655212s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094977379s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210678101s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068471909s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094768524s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210685730s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094736099s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210678101s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094737053s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210685730s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061299324s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924926758s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097580910s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961235046s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097556114s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961235046s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097126961s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.960968018s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094796181s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210655212s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097102165s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.960968018s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067185402s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.183311462s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067219734s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.183364868s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067164421s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183311462s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067193031s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183364868s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097393990s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961318970s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094496727s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210739136s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097324371s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961318970s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061881065s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.926002502s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094424248s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210739136s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060898781s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.925033569s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065610886s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181999207s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061853409s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.926002502s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065589905s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181999207s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096854210s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961059570s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060843468s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925033569s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096827507s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961059570s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060742378s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.925109863s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060762405s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.925170898s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060714722s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925109863s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060591698s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.925079346s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060490608s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925079346s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060739517s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925170898s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.047507286s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913597107s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094371796s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210922241s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094347954s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210922241s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066887856s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.183380127s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066744804s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183380127s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066463470s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.183166504s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068760872s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184867859s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068104744s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184867859s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094186783s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210975647s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094164848s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210975647s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064965248s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181869507s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064947128s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181869507s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066201210s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183166504s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093912125s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210968018s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065909386s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.182998657s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093876839s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210968018s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065887451s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.182998657s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064443588s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181739807s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064415932s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181739807s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064523697s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181938171s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064498901s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181938171s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093406677s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210983276s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093382835s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210983276s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063798904s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181411743s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093309402s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210937500s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063775063s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181411743s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093281746s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210937500s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063556671s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181381226s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063532829s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181381226s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063310623s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181175232s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097242355s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215141296s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063269615s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181175232s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097218513s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215141296s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097195625s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215148926s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097170830s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215148926s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097122192s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215202332s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097093582s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097187996s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215385437s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097146034s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097031593s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215385437s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097012520s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057223320s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.175636292s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057195663s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.175636292s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096585274s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215202332s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063093185s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181732178s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096489906s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062900543s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181732178s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063192368s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181236267s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062316895s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181236267s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.19( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.18( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1a( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1e( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1d( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.d( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.c( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.f( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.f( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.9( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.2( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.6( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.7( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.2( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.4( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.4( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.6( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.4( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.5( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.7( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.5( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.3( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.e( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666628838s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.502311707s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666577339s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502311707s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045631409s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881553650s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045609474s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881553650s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045528412s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881576538s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045514107s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881576538s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666149139s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.502319336s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666124344s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502319336s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045070648s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881378174s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045049667s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881378174s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045023918s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881462097s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045001030s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881462097s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665835381s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.502403259s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665820122s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502403259s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044493675s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881225586s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044479370s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881225586s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044302940s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881187439s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044287682s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881187439s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665797234s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.502769470s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665776253s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502769470s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043943405s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881034851s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043929100s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043830872s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881057739s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043811798s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881057739s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671725273s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.509086609s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671697617s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509086609s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670864105s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508338928s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670849800s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508338928s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043417931s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880981445s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043404579s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880981445s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670290947s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508003235s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670258522s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508003235s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670177460s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508010864s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670162201s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508010864s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670031548s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508018494s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670013428s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508018494s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042176247s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880577087s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669865608s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508308411s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042231560s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880676270s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042157173s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042071342s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880577087s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669753075s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508308411s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041958809s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880676270s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041932106s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669230461s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508064270s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669622421s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508628845s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669596672s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508628845s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041983604s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881034851s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669599533s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508674622s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041935921s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669544220s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508674622s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041315079s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880500793s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669204712s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508064270s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041291237s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880500793s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669652939s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508903503s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041206360s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880538940s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669572830s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508903503s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041181564s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880538940s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669507027s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508911133s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669469833s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508911133s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041090965s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880569458s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040925980s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880439758s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041068077s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880569458s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669351578s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508926392s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040884972s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880439758s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040772438s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880477905s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034724236s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.874450684s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034696579s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.874450684s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508941650s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669108391s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508995056s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040751457s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880477905s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669086456s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508987427s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508987427s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669073105s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508995056s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669003487s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508941650s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040740967s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880767822s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668955803s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.509048462s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040719986s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880767822s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040763855s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880889893s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668935776s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509048462s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040732384s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880889893s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 podman[98035]: 2025-12-01 09:15:27.220404508 +0000 UTC m=+0.035866019 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669313431s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508926392s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:15:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:15:27 np0005540741 systemd[1]: Started libpod-conmon-9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584.scope.
Dec  1 04:15:27 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:27 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e315c7f38a48a103cce588254720edd7ef126726e0d19dc51abeacc0417919b0/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:27 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e315c7f38a48a103cce588254720edd7ef126726e0d19dc51abeacc0417919b0/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:27 np0005540741 podman[98035]: 2025-12-01 09:15:27.562239746 +0000 UTC m=+0.377701237 container init 9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584 (image=quay.io/ceph/ceph:v18, name=amazing_tharp, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec  1 04:15:27 np0005540741 podman[98035]: 2025-12-01 09:15:27.569590439 +0000 UTC m=+0.385051860 container start 9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584 (image=quay.io/ceph/ceph:v18, name=amazing_tharp, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec  1 04:15:27 np0005540741 podman[98035]: 2025-12-01 09:15:27.57308191 +0000 UTC m=+0.388543401 container attach 9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584 (image=quay.io/ceph/ceph:v18, name=amazing_tharp, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.b scrub starts
Dec  1 04:15:27 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.b scrub ok
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.6 deep-scrub starts
Dec  1 04:15:27 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.6 deep-scrub ok
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]: {
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:15:27 np0005540741 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:        "osd_id": 0,
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:        "type": "bluestore"
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:    },
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:        "osd_id": 1,
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:        "type": "bluestore"
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:    },
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:        "osd_id": 2,
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:        "type": "bluestore"
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]:    }
Dec  1 04:15:27 np0005540741 fervent_boyd[97989]: }
Dec  1 04:15:27 np0005540741 systemd[1]: libpod-8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45.scope: Deactivated successfully.
Dec  1 04:15:27 np0005540741 systemd[1]: libpod-8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45.scope: Consumed 1.264s CPU time.
Dec  1 04:15:27 np0005540741 podman[98083]: 2025-12-01 09:15:27.882332693 +0000 UTC m=+0.029853729 container died 8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_boyd, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:27 np0005540741 systemd[1]: var-lib-containers-storage-overlay-379c11194edf416e878dc71ace62775bd6357f827bf6a76c2c0eb0144f65a58f-merged.mount: Deactivated successfully.
Dec  1 04:15:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e46 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:15:28 np0005540741 podman[98083]: 2025-12-01 09:15:28.034056417 +0000 UTC m=+0.181577453 container remove 8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_boyd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:15:28 np0005540741 systemd[1]: libpod-conmon-8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45.scope: Deactivated successfully.
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:28 np0005540741 ceph-mgr[75324]: [progress INFO root] update: starting ev c16ab44f-2930-4319-a8cf-17cb81d3e674 (Updating mds.cephfs deployment (+1 -> 1))
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.hrlhzj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) v1
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.hrlhzj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.hrlhzj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:15:28 np0005540741 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Deploying daemon mds.cephfs.compute-0.hrlhzj on compute-0
Dec  1 04:15:28 np0005540741 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Deploying daemon mds.cephfs.compute-0.hrlhzj on compute-0
Dec  1 04:15:28 np0005540741 ceph-mgr[75324]: [progress INFO root] Completed event 6a1eadee-2fd7-4097-9f7f-4e2c6af1e403 (Global Recovery Event) in 10 seconds
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.openstack"} v 0) v1
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1477409912' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Dec  1 04:15:28 np0005540741 amazing_tharp[98055]: [client.openstack]
Dec  1 04:15:28 np0005540741 amazing_tharp[98055]: #011key = AQDWWy1pAAAAABAA0JvObGCkXGU+EEwqsvh/8w==
Dec  1 04:15:28 np0005540741 amazing_tharp[98055]: #011caps mgr = "allow *"
Dec  1 04:15:28 np0005540741 amazing_tharp[98055]: #011caps mon = "profile rbd"
Dec  1 04:15:28 np0005540741 amazing_tharp[98055]: #011caps osd = "profile rbd pool=vms, profile rbd pool=volumes, profile rbd pool=backups, profile rbd pool=images, profile rbd pool=cephfs.cephfs.meta, profile rbd pool=cephfs.cephfs.data"
Dec  1 04:15:28 np0005540741 systemd[1]: libpod-9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584.scope: Deactivated successfully.
Dec  1 04:15:28 np0005540741 conmon[98055]: conmon 9c67f89bb4483983731a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584.scope/container/memory.events
Dec  1 04:15:28 np0005540741 podman[98035]: 2025-12-01 09:15:28.25820229 +0000 UTC m=+1.073663721 container died 9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584 (image=quay.io/ceph/ceph:v18, name=amazing_tharp, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e46 do_prune osdmap full prune enabled
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 e47: 3 total, 3 up, 3 in
Dec  1 04:15:28 np0005540741 systemd[1]: var-lib-containers-storage-overlay-e315c7f38a48a103cce588254720edd7ef126726e0d19dc51abeacc0417919b0-merged.mount: Deactivated successfully.
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e47: 3 total, 3 up, 3 in
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.hrlhzj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.hrlhzj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec  1 04:15:28 np0005540741 ceph-mon[75031]: from='client.? 192.168.122.100:0/1477409912' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.17( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1b( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.11( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1f( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.14( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.13( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.13( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.15( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.12( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.15( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.9( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.16( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.a( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.3( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.3( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.6( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.2( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.3( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.5( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.f( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.2( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.6( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.9( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.18( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1f( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.7( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.c( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.4( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.8( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.4( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.f( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1b( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1f( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.18( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.1e( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.19( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.f( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1c( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:15:28 np0005540741 podman[98035]: 2025-12-01 09:15:28.417880647 +0000 UTC m=+1.233342078 container remove 9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584 (image=quay.io/ceph/ceph:v18, name=amazing_tharp, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.c deep-scrub starts
Dec  1 04:15:28 np0005540741 systemd[1]: libpod-conmon-9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584.scope: Deactivated successfully.
Dec  1 04:15:28 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.c deep-scrub ok
Dec  1 04:15:28 np0005540741 podman[98271]: 2025-12-01 09:15:28.73565538 +0000 UTC m=+0.046459135 container create c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_yalow, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:15:28 np0005540741 systemd[1]: Started libpod-conmon-c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce.scope.
Dec  1 04:15:28 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:28 np0005540741 podman[98271]: 2025-12-01 09:15:28.791030067 +0000 UTC m=+0.101833842 container init c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_yalow, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec  1 04:15:28 np0005540741 podman[98271]: 2025-12-01 09:15:28.80213587 +0000 UTC m=+0.112939625 container start c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_yalow, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:15:28 np0005540741 podman[98271]: 2025-12-01 09:15:28.8059419 +0000 UTC m=+0.116745666 container attach c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_yalow, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Dec  1 04:15:28 np0005540741 beautiful_yalow[98286]: 167 167
Dec  1 04:15:28 np0005540741 podman[98271]: 2025-12-01 09:15:28.711213005 +0000 UTC m=+0.022016790 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:28 np0005540741 systemd[1]: libpod-c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce.scope: Deactivated successfully.
Dec  1 04:15:28 np0005540741 podman[98271]: 2025-12-01 09:15:28.808890534 +0000 UTC m=+0.119694289 container died c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_yalow, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:15:28 np0005540741 systemd[1]: var-lib-containers-storage-overlay-a8c1ac4085fa0fa47d047a733f6050cd0bf42faafa9103b66e23a1a898800365-merged.mount: Deactivated successfully.
Dec  1 04:15:28 np0005540741 podman[98271]: 2025-12-01 09:15:28.94776121 +0000 UTC m=+0.258564965 container remove c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_yalow, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:28 np0005540741 systemd[1]: libpod-conmon-c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce.scope: Deactivated successfully.
Dec  1 04:15:28 np0005540741 systemd[1]: Reloading.
Dec  1 04:15:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v106: 193 pgs: 46 peering, 147 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:29 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:15:29 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:15:29 np0005540741 systemd[1]: Reloading.
Dec  1 04:15:29 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.b scrub starts
Dec  1 04:15:29 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.b scrub ok
Dec  1 04:15:29 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:15:29 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:15:29 np0005540741 ceph-mon[75031]: Deploying daemon mds.cephfs.compute-0.hrlhzj on compute-0
Dec  1 04:15:29 np0005540741 systemd[1]: Starting Ceph mds.cephfs.compute-0.hrlhzj for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec  1 04:15:30 np0005540741 ansible-async_wrapper.py[98543]: Invoked with j521419402164 30 /home/zuul/.ansible/tmp/ansible-tmp-1764580529.4487615-36621-159398386950128/AnsiballZ_command.py _
Dec  1 04:15:30 np0005540741 ansible-async_wrapper.py[98585]: Starting module and watcher
Dec  1 04:15:30 np0005540741 ansible-async_wrapper.py[98585]: Start watching 98586 (30)
Dec  1 04:15:30 np0005540741 ansible-async_wrapper.py[98586]: Start module (98586)
Dec  1 04:15:30 np0005540741 ansible-async_wrapper.py[98543]: Return async_wrapper task started.
Dec  1 04:15:30 np0005540741 podman[98588]: 2025-12-01 09:15:30.232177597 +0000 UTC m=+0.048498910 container create bd39bff8d9d91d0b5e01eaef8e36288bce2634cdb58658e44a05dec668f72782 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mds-cephfs-compute-0-hrlhzj, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00c160e10467ebfdc74554f624bdb1b974dd2288c23f65838fc3b97b9eb35861/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00c160e10467ebfdc74554f624bdb1b974dd2288c23f65838fc3b97b9eb35861/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00c160e10467ebfdc74554f624bdb1b974dd2288c23f65838fc3b97b9eb35861/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00c160e10467ebfdc74554f624bdb1b974dd2288c23f65838fc3b97b9eb35861/merged/var/lib/ceph/mds/ceph-cephfs.compute-0.hrlhzj supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:30 np0005540741 podman[98588]: 2025-12-01 09:15:30.289685741 +0000 UTC m=+0.106007074 container init bd39bff8d9d91d0b5e01eaef8e36288bce2634cdb58658e44a05dec668f72782 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mds-cephfs-compute-0-hrlhzj, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec  1 04:15:30 np0005540741 podman[98588]: 2025-12-01 09:15:30.294961519 +0000 UTC m=+0.111282832 container start bd39bff8d9d91d0b5e01eaef8e36288bce2634cdb58658e44a05dec668f72782 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mds-cephfs-compute-0-hrlhzj, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:30 np0005540741 bash[98588]: bd39bff8d9d91d0b5e01eaef8e36288bce2634cdb58658e44a05dec668f72782
Dec  1 04:15:30 np0005540741 podman[98588]: 2025-12-01 09:15:30.204132537 +0000 UTC m=+0.020453870 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:30 np0005540741 systemd[1]: Started Ceph mds.cephfs.compute-0.hrlhzj for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec  1 04:15:30 np0005540741 python3[98587]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch status --format json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:30 np0005540741 ceph-mds[98608]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:15:30 np0005540741 ceph-mds[98608]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mds, pid 2
Dec  1 04:15:30 np0005540741 ceph-mds[98608]: main not setting numa affinity
Dec  1 04:15:30 np0005540741 ceph-mds[98608]: pidfile_write: ignore empty --pid-file
Dec  1 04:15:30 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mds-cephfs-compute-0-hrlhzj[98604]: starting mds.cephfs.compute-0.hrlhzj at 
Dec  1 04:15:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:15:30 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:15:30 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj Updating MDS map to version 2 from mon.0
Dec  1 04:15:30 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mds.cephfs}] v 0) v1
Dec  1 04:15:30 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:30 np0005540741 ceph-mgr[75324]: [progress INFO root] complete: finished ev c16ab44f-2930-4319-a8cf-17cb81d3e674 (Updating mds.cephfs deployment (+1 -> 1))
Dec  1 04:15:30 np0005540741 ceph-mgr[75324]: [progress INFO root] Completed event c16ab44f-2930-4319-a8cf-17cb81d3e674 (Updating mds.cephfs deployment (+1 -> 1)) in 2 seconds
Dec  1 04:15:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mds_join_fs}] v 0) v1
Dec  1 04:15:30 np0005540741 podman[98609]: 2025-12-01 09:15:30.413459839 +0000 UTC m=+0.069704423 container create ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866 (image=quay.io/ceph/ceph:v18, name=friendly_meninsky, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:15:30 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mds.cephfs}] v 0) v1
Dec  1 04:15:30 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.e scrub starts
Dec  1 04:15:30 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:30 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.e scrub ok
Dec  1 04:15:30 np0005540741 systemd[1]: Started libpod-conmon-ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866.scope.
Dec  1 04:15:30 np0005540741 podman[98609]: 2025-12-01 09:15:30.372730807 +0000 UTC m=+0.028975421 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:30 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2d0685b03cbe14b6087e0188b14446f4ae223cdb47b171a1fd3529276a98ea0/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2d0685b03cbe14b6087e0188b14446f4ae223cdb47b171a1fd3529276a98ea0/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:30 np0005540741 podman[98609]: 2025-12-01 09:15:30.500063757 +0000 UTC m=+0.156308371 container init ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866 (image=quay.io/ceph/ceph:v18, name=friendly_meninsky, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:15:30 np0005540741 podman[98609]: 2025-12-01 09:15:30.512329286 +0000 UTC m=+0.168573870 container start ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866 (image=quay.io/ceph/ceph:v18, name=friendly_meninsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec  1 04:15:30 np0005540741 podman[98609]: 2025-12-01 09:15:30.516585701 +0000 UTC m=+0.172830295 container attach ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866 (image=quay.io/ceph/ceph:v18, name=friendly_meninsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:15:30 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:30 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:30 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:30 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:30 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v107: 193 pgs: 46 peering, 147 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:31 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14256 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec  1 04:15:31 np0005540741 friendly_meninsky[98661]: 
Dec  1 04:15:31 np0005540741 friendly_meninsky[98661]: {"available": true, "backend": "cephadm", "paused": false, "workers": 10}
Dec  1 04:15:31 np0005540741 systemd[1]: libpod-ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866.scope: Deactivated successfully.
Dec  1 04:15:31 np0005540741 podman[98609]: 2025-12-01 09:15:31.100963765 +0000 UTC m=+0.757208359 container died ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866 (image=quay.io/ceph/ceph:v18, name=friendly_meninsky, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Dec  1 04:15:31 np0005540741 systemd[1]: var-lib-containers-storage-overlay-e2d0685b03cbe14b6087e0188b14446f4ae223cdb47b171a1fd3529276a98ea0-merged.mount: Deactivated successfully.
Dec  1 04:15:31 np0005540741 podman[98609]: 2025-12-01 09:15:31.160390851 +0000 UTC m=+0.816635435 container remove ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866 (image=quay.io/ceph/ceph:v18, name=friendly_meninsky, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec  1 04:15:31 np0005540741 systemd[1]: libpod-conmon-ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866.scope: Deactivated successfully.
Dec  1 04:15:31 np0005540741 ansible-async_wrapper.py[98586]: Module complete (98586)
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).mds e3 new map
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).mds e3 print_map#012e3#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:15:19.277019+0000#012modified#0112025-12-01T09:15:19.277097+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.hrlhzj{-1:14254} state up:standby seq 1 addr [v2:192.168.122.100:6814/1560103012,v1:192.168.122.100:6815/1560103012] compat {c=[1],r=[1],i=[7ff]}]
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj Updating MDS map to version 3 from mon.0
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj Monitors have assigned me to become a standby.
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : mds.? [v2:192.168.122.100:6814/1560103012,v1:192.168.122.100:6815/1560103012] up:boot
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).mds e3 assigned standby [v2:192.168.122.100:6814/1560103012,v1:192.168.122.100:6815/1560103012] as mds.0
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : daemon mds.cephfs.compute-0.hrlhzj assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : Cluster is now healthy
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : fsmap cephfs:0 1 up:standby
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mds metadata", "who": "cephfs.compute-0.hrlhzj"} v 0) v1
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.hrlhzj"}]: dispatch
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).mds e3 all = 0
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).mds e4 new map
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).mds e4 print_map#012e4#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:15:19.277019+0000#012modified#0112025-12-01T09:15:31.359857+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=14254}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-0.hrlhzj{0:14254} state up:creating seq 1 addr [v2:192.168.122.100:6814/1560103012,v1:192.168.122.100:6815/1560103012] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=cephfs.compute-0.hrlhzj=up:creating}
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj Updating MDS map to version 4 from mon.0
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.0.4 handle_mds_map i am now mds.0.4
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x1
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x100
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x600
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x601
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x602
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x603
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x604
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x605
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x606
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x607
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x608
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x609
Dec  1 04:15:31 np0005540741 ceph-mds[98608]: mds.0.4 creating_done
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: log_channel(cluster) log [INF] : daemon mds.cephfs.compute-0.hrlhzj is now active in filesystem cephfs as rank 0
Dec  1 04:15:31 np0005540741 podman[98953]: 2025-12-01 09:15:31.435729407 +0000 UTC m=+0.062605187 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:31 np0005540741 python3[98940]: ansible-ansible.legacy.async_status Invoked with jid=j521419402164.98543 mode=status _async_dir=/root/.ansible_async
Dec  1 04:15:31 np0005540741 podman[98953]: 2025-12-01 09:15:31.54579638 +0000 UTC m=+0.172672140 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:15:31 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.c scrub starts
Dec  1 04:15:31 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.c scrub ok
Dec  1 04:15:31 np0005540741 python3[99056]: ansible-ansible.legacy.async_status Invoked with jid=j521419402164.98543 mode=cleanup _async_dir=/root/.ansible_async
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: daemon mds.cephfs.compute-0.hrlhzj assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: Cluster is now healthy
Dec  1 04:15:31 np0005540741 ceph-mon[75031]: daemon mds.cephfs.compute-0.hrlhzj is now active in filesystem cephfs as rank 0
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:32 np0005540741 python3[99197]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch status --format json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).mds e5 new map
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).mds e5 print_map#012e5#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:15:19.277019+0000#012modified#0112025-12-01T09:15:32.363675+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=14254}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-0.hrlhzj{0:14254} state up:active seq 2 join_fscid=1 addr [v2:192.168.122.100:6814/1560103012,v1:192.168.122.100:6815/1560103012] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Dec  1 04:15:32 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj Updating MDS map to version 5 from mon.0
Dec  1 04:15:32 np0005540741 ceph-mds[98608]: mds.0.4 handle_mds_map i am now mds.0.4
Dec  1 04:15:32 np0005540741 ceph-mds[98608]: mds.0.4 handle_mds_map state change up:creating --> up:active
Dec  1 04:15:32 np0005540741 ceph-mds[98608]: mds.0.4 recovery_done -- successful recovery!
Dec  1 04:15:32 np0005540741 ceph-mds[98608]: mds.0.4 active_start
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : mds.? [v2:192.168.122.100:6814/1560103012,v1:192.168.122.100:6815/1560103012] up:active
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=cephfs.compute-0.hrlhzj=up:active}
Dec  1 04:15:32 np0005540741 podman[99250]: 2025-12-01 09:15:32.384269306 +0000 UTC m=+0.030241070 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:32 np0005540741 podman[99250]: 2025-12-01 09:15:32.483633978 +0000 UTC m=+0.129605702 container create e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78 (image=quay.io/ceph/ceph:v18, name=gifted_blackwell, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:32 np0005540741 systemd[1]: Started libpod-conmon-e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78.scope.
Dec  1 04:15:32 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:32 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b6b7fbf8bbcb4a8db419853aac851376744b8b07f6f4493d257b013328d49840/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:32 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b6b7fbf8bbcb4a8db419853aac851376744b8b07f6f4493d257b013328d49840/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:32 np0005540741 podman[99250]: 2025-12-01 09:15:32.554930511 +0000 UTC m=+0.200902245 container init e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78 (image=quay.io/ceph/ceph:v18, name=gifted_blackwell, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec  1 04:15:32 np0005540741 podman[99250]: 2025-12-01 09:15:32.561632883 +0000 UTC m=+0.207604607 container start e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78 (image=quay.io/ceph/ceph:v18, name=gifted_blackwell, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True)
Dec  1 04:15:32 np0005540741 podman[99250]: 2025-12-01 09:15:32.564847936 +0000 UTC m=+0.210819660 container attach e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78 (image=quay.io/ceph/ceph:v18, name=gifted_blackwell, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:32 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 8f31d491-da8e-4f01-8c16-8a37e79ad638 does not exist
Dec  1 04:15:32 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev cd2a7ad4-5e6a-404c-b778-2976ccd148f6 does not exist
Dec  1 04:15:32 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 79bcb737-6fe1-4b25-a9f9-4578d8521ef9 does not exist
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:15:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:15:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v108: 193 pgs: 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 127 B/s wr, 1 op/s
Dec  1 04:15:33 np0005540741 ceph-mgr[75324]: [progress INFO root] Writing back 11 completed events
Dec  1 04:15:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) v1
Dec  1 04:15:33 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:33 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14258 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec  1 04:15:33 np0005540741 gifted_blackwell[99299]: 
Dec  1 04:15:33 np0005540741 gifted_blackwell[99299]: {"available": true, "backend": "cephadm", "paused": false, "workers": 10}
Dec  1 04:15:33 np0005540741 systemd[1]: libpod-e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78.scope: Deactivated successfully.
Dec  1 04:15:33 np0005540741 podman[99250]: 2025-12-01 09:15:33.214869692 +0000 UTC m=+0.860841416 container died e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78 (image=quay.io/ceph/ceph:v18, name=gifted_blackwell, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec  1 04:15:33 np0005540741 systemd[1]: var-lib-containers-storage-overlay-b6b7fbf8bbcb4a8db419853aac851376744b8b07f6f4493d257b013328d49840-merged.mount: Deactivated successfully.
Dec  1 04:15:33 np0005540741 podman[99250]: 2025-12-01 09:15:33.315635229 +0000 UTC m=+0.961606943 container remove e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78 (image=quay.io/ceph/ceph:v18, name=gifted_blackwell, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Dec  1 04:15:33 np0005540741 systemd[1]: libpod-conmon-e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78.scope: Deactivated successfully.
Dec  1 04:15:33 np0005540741 podman[99500]: 2025-12-01 09:15:33.539305797 +0000 UTC m=+0.046159836 container create eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_jemison, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Dec  1 04:15:33 np0005540741 systemd[1]: Started libpod-conmon-eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de.scope.
Dec  1 04:15:33 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:33 np0005540741 podman[99500]: 2025-12-01 09:15:33.521310736 +0000 UTC m=+0.028164795 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:33 np0005540741 podman[99500]: 2025-12-01 09:15:33.677705059 +0000 UTC m=+0.184559148 container init eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_jemison, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  1 04:15:33 np0005540741 podman[99500]: 2025-12-01 09:15:33.683284806 +0000 UTC m=+0.190138845 container start eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_jemison, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec  1 04:15:33 np0005540741 podman[99500]: 2025-12-01 09:15:33.686716374 +0000 UTC m=+0.193570423 container attach eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_jemison, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:15:33 np0005540741 nervous_jemison[99516]: 167 167
Dec  1 04:15:33 np0005540741 systemd[1]: libpod-eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de.scope: Deactivated successfully.
Dec  1 04:15:33 np0005540741 podman[99500]: 2025-12-01 09:15:33.688601964 +0000 UTC m=+0.195456043 container died eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_jemison, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:33 np0005540741 systemd[1]: var-lib-containers-storage-overlay-16ffc648c1d1975b8cc2180fa4b84de8979d1623c4d8c5401adec8b02702e082-merged.mount: Deactivated successfully.
Dec  1 04:15:33 np0005540741 podman[99500]: 2025-12-01 09:15:33.729071648 +0000 UTC m=+0.235925687 container remove eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_jemison, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:15:33 np0005540741 systemd[1]: libpod-conmon-eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de.scope: Deactivated successfully.
Dec  1 04:15:33 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:15:33 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:33 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:15:33 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:33 np0005540741 podman[99540]: 2025-12-01 09:15:33.888478367 +0000 UTC m=+0.039873807 container create 128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_ritchie, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec  1 04:15:33 np0005540741 systemd[1]: Started libpod-conmon-128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063.scope.
Dec  1 04:15:33 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:33 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab0354d0c607acd3b0ed1cfb2916043e3b0d68fee654fc58b100b09320876852/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:33 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab0354d0c607acd3b0ed1cfb2916043e3b0d68fee654fc58b100b09320876852/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:33 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab0354d0c607acd3b0ed1cfb2916043e3b0d68fee654fc58b100b09320876852/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:33 np0005540741 podman[99540]: 2025-12-01 09:15:33.870696822 +0000 UTC m=+0.022092302 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:33 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab0354d0c607acd3b0ed1cfb2916043e3b0d68fee654fc58b100b09320876852/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:33 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab0354d0c607acd3b0ed1cfb2916043e3b0d68fee654fc58b100b09320876852/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:33 np0005540741 podman[99540]: 2025-12-01 09:15:33.979438673 +0000 UTC m=+0.130834133 container init 128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_ritchie, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:15:33 np0005540741 podman[99540]: 2025-12-01 09:15:33.98595726 +0000 UTC m=+0.137352710 container start 128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_ritchie, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec  1 04:15:33 np0005540741 podman[99540]: 2025-12-01 09:15:33.989678568 +0000 UTC m=+0.141074018 container attach 128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_ritchie, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Dec  1 04:15:34 np0005540741 python3[99587]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch ls --export -f json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:34 np0005540741 podman[99588]: 2025-12-01 09:15:34.235104356 +0000 UTC m=+0.039735742 container create f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797 (image=quay.io/ceph/ceph:v18, name=exciting_goldwasser, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec  1 04:15:34 np0005540741 systemd[1]: Started libpod-conmon-f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797.scope.
Dec  1 04:15:34 np0005540741 podman[99588]: 2025-12-01 09:15:34.217012652 +0000 UTC m=+0.021644058 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:34 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:34 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a68a9202e7b1ba306129f0cef1a1ffcc19f77dfd6d13a79c314edc07f54d3db6/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:34 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a68a9202e7b1ba306129f0cef1a1ffcc19f77dfd6d13a79c314edc07f54d3db6/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:34 np0005540741 podman[99588]: 2025-12-01 09:15:34.359803463 +0000 UTC m=+0.164434869 container init f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797 (image=quay.io/ceph/ceph:v18, name=exciting_goldwasser, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:15:34 np0005540741 podman[99588]: 2025-12-01 09:15:34.365262866 +0000 UTC m=+0.169894252 container start f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797 (image=quay.io/ceph/ceph:v18, name=exciting_goldwasser, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:15:34 np0005540741 podman[99588]: 2025-12-01 09:15:34.36855409 +0000 UTC m=+0.173185496 container attach f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797 (image=quay.io/ceph/ceph:v18, name=exciting_goldwasser, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec  1 04:15:34 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.d deep-scrub starts
Dec  1 04:15:34 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.d deep-scrub ok
Dec  1 04:15:34 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14260 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec  1 04:15:34 np0005540741 exciting_goldwasser[99604]: 
Dec  1 04:15:34 np0005540741 exciting_goldwasser[99604]: [{"placement": {"host_pattern": "*"}, "service_name": "crash", "service_type": "crash"}, {"placement": {"hosts": ["compute-0"]}, "service_id": "cephfs", "service_name": "mds.cephfs", "service_type": "mds"}, {"placement": {"hosts": ["compute-0"]}, "service_name": "mgr", "service_type": "mgr"}, {"placement": {"hosts": ["compute-0"]}, "service_name": "mon", "service_type": "mon"}, {"placement": {"hosts": ["compute-0"]}, "service_id": "default_drive_group", "service_name": "osd.default_drive_group", "service_type": "osd", "spec": {"data_devices": {"paths": ["/dev/ceph_vg0/ceph_lv0", "/dev/ceph_vg1/ceph_lv1", "/dev/ceph_vg2/ceph_lv2"]}, "filter_logic": "AND", "objectstore": "bluestore"}}]
Dec  1 04:15:34 np0005540741 systemd[1]: libpod-f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797.scope: Deactivated successfully.
Dec  1 04:15:34 np0005540741 conmon[99604]: conmon f767a7517ca2e52adef8 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797.scope/container/memory.events
Dec  1 04:15:34 np0005540741 podman[99588]: 2025-12-01 09:15:34.983422461 +0000 UTC m=+0.788053847 container died f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797 (image=quay.io/ceph/ceph:v18, name=exciting_goldwasser, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec  1 04:15:34 np0005540741 funny_ritchie[99557]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:15:34 np0005540741 funny_ritchie[99557]: --> relative data size: 1.0
Dec  1 04:15:34 np0005540741 funny_ritchie[99557]: --> All data devices are unavailable
Dec  1 04:15:35 np0005540741 systemd[1]: libpod-128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063.scope: Deactivated successfully.
Dec  1 04:15:35 np0005540741 podman[99540]: 2025-12-01 09:15:35.036249348 +0000 UTC m=+1.187644798 container died 128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_ritchie, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Dec  1 04:15:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v109: 193 pgs: 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 127 B/s wr, 1 op/s
Dec  1 04:15:35 np0005540741 systemd[1]: var-lib-containers-storage-overlay-a68a9202e7b1ba306129f0cef1a1ffcc19f77dfd6d13a79c314edc07f54d3db6-merged.mount: Deactivated successfully.
Dec  1 04:15:35 np0005540741 systemd[1]: var-lib-containers-storage-overlay-ab0354d0c607acd3b0ed1cfb2916043e3b0d68fee654fc58b100b09320876852-merged.mount: Deactivated successfully.
Dec  1 04:15:35 np0005540741 podman[99588]: 2025-12-01 09:15:35.076947839 +0000 UTC m=+0.881579225 container remove f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797 (image=quay.io/ceph/ceph:v18, name=exciting_goldwasser, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec  1 04:15:35 np0005540741 systemd[1]: libpod-conmon-f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797.scope: Deactivated successfully.
Dec  1 04:15:35 np0005540741 podman[99540]: 2025-12-01 09:15:35.103406929 +0000 UTC m=+1.254802379 container remove 128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_ritchie, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:35 np0005540741 systemd[1]: libpod-conmon-128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063.scope: Deactivated successfully.
Dec  1 04:15:35 np0005540741 ansible-async_wrapper.py[98585]: Done in kid B.
Dec  1 04:15:35 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.10 deep-scrub starts
Dec  1 04:15:35 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.10 deep-scrub ok
Dec  1 04:15:35 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.10 scrub starts
Dec  1 04:15:35 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.10 scrub ok
Dec  1 04:15:35 np0005540741 podman[99818]: 2025-12-01 09:15:35.78952376 +0000 UTC m=+0.057304519 container create 432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_vaughan, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec  1 04:15:35 np0005540741 systemd[1]: Started libpod-conmon-432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e.scope.
Dec  1 04:15:35 np0005540741 podman[99818]: 2025-12-01 09:15:35.760689315 +0000 UTC m=+0.028470174 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:35 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:35 np0005540741 podman[99818]: 2025-12-01 09:15:35.886985413 +0000 UTC m=+0.154766212 container init 432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_vaughan, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec  1 04:15:35 np0005540741 podman[99818]: 2025-12-01 09:15:35.895257045 +0000 UTC m=+0.163037814 container start 432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_vaughan, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:15:35 np0005540741 podman[99818]: 2025-12-01 09:15:35.899321564 +0000 UTC m=+0.167102363 container attach 432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_vaughan, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:15:35 np0005540741 determined_vaughan[99840]: 167 167
Dec  1 04:15:35 np0005540741 systemd[1]: libpod-432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e.scope: Deactivated successfully.
Dec  1 04:15:35 np0005540741 podman[99818]: 2025-12-01 09:15:35.902858027 +0000 UTC m=+0.170638796 container died 432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_vaughan, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 04:15:35 np0005540741 systemd[1]: var-lib-containers-storage-overlay-af16760c2bd1c566ef056dabf05a1620c771446645ac8be5c8f0fae3e1157f94-merged.mount: Deactivated successfully.
Dec  1 04:15:35 np0005540741 podman[99818]: 2025-12-01 09:15:35.940998227 +0000 UTC m=+0.208778996 container remove 432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_vaughan, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:35 np0005540741 systemd[1]: libpod-conmon-432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e.scope: Deactivated successfully.
Dec  1 04:15:36 np0005540741 python3[99865]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch ps -f json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:36 np0005540741 podman[99883]: 2025-12-01 09:15:36.152846078 +0000 UTC m=+0.089790800 container create dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banach, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:15:36 np0005540741 podman[99883]: 2025-12-01 09:15:36.092200384 +0000 UTC m=+0.029145086 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:36 np0005540741 systemd[1]: Started libpod-conmon-dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e.scope.
Dec  1 04:15:36 np0005540741 podman[99895]: 2025-12-01 09:15:36.212412688 +0000 UTC m=+0.116093055 container create 61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b (image=quay.io/ceph/ceph:v18, name=vibrant_agnesi, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec  1 04:15:36 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:36 np0005540741 systemd[1]: Started libpod-conmon-61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b.scope.
Dec  1 04:15:36 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4e3c611eb7fe346f2906fcf4480a112e83c7875b135c99ff80b639b22dd8cf8/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:36 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4e3c611eb7fe346f2906fcf4480a112e83c7875b135c99ff80b639b22dd8cf8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:36 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4e3c611eb7fe346f2906fcf4480a112e83c7875b135c99ff80b639b22dd8cf8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:36 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4e3c611eb7fe346f2906fcf4480a112e83c7875b135c99ff80b639b22dd8cf8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:36 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:36 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9998c787f466d491288c66d6cd37a30bf6ca01dbd2187f78fb60a702b52e24f/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:36 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9998c787f466d491288c66d6cd37a30bf6ca01dbd2187f78fb60a702b52e24f/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:36 np0005540741 podman[99883]: 2025-12-01 09:15:36.262178687 +0000 UTC m=+0.199123409 container init dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banach, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec  1 04:15:36 np0005540741 podman[99883]: 2025-12-01 09:15:36.270588604 +0000 UTC m=+0.207533306 container start dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banach, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:15:36 np0005540741 podman[99895]: 2025-12-01 09:15:36.178790581 +0000 UTC m=+0.082471008 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:36 np0005540741 podman[99895]: 2025-12-01 09:15:36.27550838 +0000 UTC m=+0.179188807 container init 61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b (image=quay.io/ceph/ceph:v18, name=vibrant_agnesi, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:36 np0005540741 podman[99895]: 2025-12-01 09:15:36.281934834 +0000 UTC m=+0.185615231 container start 61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b (image=quay.io/ceph/ceph:v18, name=vibrant_agnesi, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:15:36 np0005540741 podman[99883]: 2025-12-01 09:15:36.282377248 +0000 UTC m=+0.219321960 container attach dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banach, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:15:36 np0005540741 podman[99895]: 2025-12-01 09:15:36.286964564 +0000 UTC m=+0.190644941 container attach 61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b (image=quay.io/ceph/ceph:v18, name=vibrant_agnesi, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec  1 04:15:36 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14262 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec  1 04:15:36 np0005540741 vibrant_agnesi[99917]: 
Dec  1 04:15:36 np0005540741 vibrant_agnesi[99917]: [{"container_id": "83d60e6b432c", "container_image_digests": ["quay.io/ceph/ceph@sha256:7d8bb82696d5d9cbeae2a2828dc12b6835aa2dded890fa3ac5a733cb66b72b1c", "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0"], "container_image_id": "0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1", "container_image_name": "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0", "cpu_percentage": "0.39%", "created": "2025-12-01T09:13:40.947545Z", "daemon_id": "compute-0", "daemon_name": "crash.compute-0", "daemon_type": "crash", "events": ["2025-12-01T09:13:41.007313Z daemon:crash.compute-0 [INFO] \"Deployed crash.compute-0 on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-01T09:15:32.118088Z", "memory_usage": 11586764, "ports": [], "service_name": "crash", "started": "2025-12-01T09:13:40.847848Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@crash.compute-0", "version": "18.2.7"}, {"container_id": "bd39bff8d9d9", "container_image_digests": ["quay.io/ceph/ceph@sha256:7d8bb82696d5d9cbeae2a2828dc12b6835aa2dded890fa3ac5a733cb66b72b1c", "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0"], "container_image_id": "0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1", "container_image_name": "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0", "cpu_percentage": "6.98%", "created": "2025-12-01T09:15:30.308898Z", "daemon_id": "cephfs.compute-0.hrlhzj", "daemon_name": "mds.cephfs.compute-0.hrlhzj", "daemon_type": "mds", "events": ["2025-12-01T09:15:30.360426Z daemon:mds.cephfs.compute-0.hrlhzj [INFO] \"Deployed mds.cephfs.compute-0.hrlhzj on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-01T09:15:32.118409Z", 
"memory_usage": 15833497, "ports": [], "service_name": "mds.cephfs", "started": "2025-12-01T09:15:30.209936Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@mds.cephfs.compute-0.hrlhzj", "version": "18.2.7"}, {"container_id": "d04e39f95959", "container_image_digests": ["quay.io/ceph/ceph@sha256:7d8bb82696d5d9cbeae2a2828dc12b6835aa2dded890fa3ac5a733cb66b72b1c", "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0"], "container_image_id": "0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1", "container_image_name": "quay.io/ceph/ceph:v18", "cpu_percentage": "26.87%", "created": "2025-12-01T09:12:25.057330Z", "daemon_id": "compute-0.psduho", "daemon_name": "mgr.compute-0.psduho", "daemon_type": "mgr", "events": ["2025-12-01T09:14:43.592702Z daemon:mgr.compute-0.psduho [INFO] \"Reconfigured mgr.compute-0.psduho on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-01T09:15:32.118023Z", "memory_usage": 549873254, "ports": [9283, 8765], "service_name": "mgr", "started": "2025-12-01T09:12:24.972967Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@mgr.compute-0.psduho", "version": "18.2.7"}, {"container_id": "a46df485ce4f", "container_image_digests": ["quay.io/ceph/ceph@sha256:7d8bb82696d5d9cbeae2a2828dc12b6835aa2dded890fa3ac5a733cb66b72b1c", "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0"], "container_image_id": "0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1", "container_image_name": "quay.io/ceph/ceph:v18", "cpu_percentage": "1.96%", "created": "2025-12-01T09:12:20.053188Z", "daemon_id": "compute-0", "daemon_name": "mon.compute-0", "daemon_type": "mon", "events": ["2025-12-01T09:14:42.780063Z daemon:mon.compute-0 [INFO] \"Reconfigured mon.compute-0 on host 'compute-0'\""], "hostname": "compute-0", "is_active": 
false, "last_refresh": "2025-12-01T09:15:32.117922Z", "memory_request": 2147483648, "memory_usage": 39992688, "ports": [], "service_name": "mon", "started": "2025-12-01T09:12:22.666870Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@mon.compute-0", "version": "18.2.7"}, {"container_id": "b27d497db5b1", "container_image_digests": ["quay.io/ceph/ceph@sha256:7d8bb82696d5d9cbeae2a2828dc12b6835aa2dded890fa3ac5a733cb66b72b1c", "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0"], "container_image_id": "0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1", "container_image_name": "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0", "cpu_percentage": "1.95%", "created": "2025-12-01T09:14:08.958385Z", "daemon_id": "0", "daemon_name": "osd.0", "daemon_type": "osd", "events": ["2025-12-01T09:14:09.023175Z daemon:osd.0 [INFO] \"Deployed osd.0 on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-01T09:15:32.118154Z", "memory_request": 4294967296, "memory_usage": 67360522, "ports": [], "service_name": "osd.default_drive_group", "started": "2025-12-01T09:14:08.869614Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@osd.0", "version": "18.2.7"}, {"container_id": "2203330e3b4c", "container_image_digests": ["quay.io/ceph/ceph@sha256:7d8bb82696d5d9cbeae2a2828dc12b6835aa2dded890fa3ac5a733cb66b72b1c", "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0"], "container_image_id": "0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1", "container_image_name": "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0", "cpu_percentage": "2.25%", "created": "2025-12-01T09:14:13.757531Z", "daemon_id": "1", "daemon_name": "osd.1", "daemon_type": "osd", "events": 
["2025-12-01T09:14:13.869959Z daemon:osd.1 [INFO] \"Deployed osd.1 on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-01T09:15:32.118215Z", "memory_request": 4294967296, "memory_usage": 66007859, "ports": [], "service_name": "osd.default_drive_group", "started": "2025-12-01T09:14:13.345532Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@osd.1", "version": "18.2.7"}, {"container_id": "b8cc745a8217", "container_image_digests": ["quay.io/ceph/ceph@sha256:7d8bb82696d5d9cbeae2a2828dc12b6835aa2dded890fa3ac5a733cb66b72b1c", "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0"], "container_image_id": "0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1", "container_image_name": "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0", "cpu_percentage": "3.05%", "created": "2025-12-01T09:14:22.409795Z", "daemon_id": "2", "daemon_name": "osd.2", "daemon_type": "osd", "events": ["2025-12-01T09:14:22.523036Z daemon:osd.2 [INFO] \"Deployed osd.2 on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-01T09:15:32.118274Z", "memory_request": 4294967296, "memory_usage": 66175631, "ports": [], "service_name": "osd.default_drive_group", "started": "2025-12-01T09:14:22.192700Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@osd.2", "version": "18.2.7"}]
Dec  1 04:15:36 np0005540741 systemd[1]: libpod-61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b.scope: Deactivated successfully.
Dec  1 04:15:36 np0005540741 podman[99895]: 2025-12-01 09:15:36.84749373 +0000 UTC m=+0.751174137 container died 61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b (image=quay.io/ceph/ceph:v18, name=vibrant_agnesi, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:15:36 np0005540741 systemd[1]: var-lib-containers-storage-overlay-f9998c787f466d491288c66d6cd37a30bf6ca01dbd2187f78fb60a702b52e24f-merged.mount: Deactivated successfully.
Dec  1 04:15:36 np0005540741 podman[99895]: 2025-12-01 09:15:36.900891155 +0000 UTC m=+0.804571532 container remove 61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b (image=quay.io/ceph/ceph:v18, name=vibrant_agnesi, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:36 np0005540741 systemd[1]: libpod-conmon-61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b.scope: Deactivated successfully.
Dec  1 04:15:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v110: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s wr, 4 op/s
Dec  1 04:15:37 np0005540741 lucid_banach[99912]: {
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:    "0": [
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:        {
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "devices": [
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "/dev/loop3"
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            ],
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "lv_name": "ceph_lv0",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "lv_size": "21470642176",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "name": "ceph_lv0",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "tags": {
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.cluster_name": "ceph",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.crush_device_class": "",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.encrypted": "0",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.osd_id": "0",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.type": "block",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.vdo": "0"
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            },
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "type": "block",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "vg_name": "ceph_vg0"
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:        }
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:    ],
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:    "1": [
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:        {
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "devices": [
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "/dev/loop4"
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            ],
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "lv_name": "ceph_lv1",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "lv_size": "21470642176",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "name": "ceph_lv1",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "tags": {
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.cluster_name": "ceph",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.crush_device_class": "",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.encrypted": "0",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.osd_id": "1",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.type": "block",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.vdo": "0"
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            },
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "type": "block",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "vg_name": "ceph_vg1"
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:        }
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:    ],
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:    "2": [
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:        {
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "devices": [
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "/dev/loop5"
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            ],
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "lv_name": "ceph_lv2",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "lv_size": "21470642176",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "name": "ceph_lv2",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "tags": {
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.cluster_name": "ceph",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.crush_device_class": "",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.encrypted": "0",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.osd_id": "2",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.type": "block",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:                "ceph.vdo": "0"
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            },
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "type": "block",
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:            "vg_name": "ceph_vg2"
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:        }
Dec  1 04:15:37 np0005540741 lucid_banach[99912]:    ]
Dec  1 04:15:37 np0005540741 lucid_banach[99912]: }
Dec  1 04:15:37 np0005540741 systemd[1]: libpod-dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e.scope: Deactivated successfully.
Dec  1 04:15:37 np0005540741 podman[99883]: 2025-12-01 09:15:37.0970578 +0000 UTC m=+1.034002512 container died dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banach, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec  1 04:15:37 np0005540741 systemd[1]: var-lib-containers-storage-overlay-c4e3c611eb7fe346f2906fcf4480a112e83c7875b135c99ff80b639b22dd8cf8-merged.mount: Deactivated successfully.
Dec  1 04:15:37 np0005540741 podman[99883]: 2025-12-01 09:15:37.156067942 +0000 UTC m=+1.093012644 container remove dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banach, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:37 np0005540741 systemd[1]: libpod-conmon-dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e.scope: Deactivated successfully.
Dec  1 04:15:37 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.12 deep-scrub starts
Dec  1 04:15:37 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.12 deep-scrub ok
Dec  1 04:15:37 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Dec  1 04:15:37 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Dec  1 04:15:37 np0005540741 podman[100114]: 2025-12-01 09:15:37.760494352 +0000 UTC m=+0.044056689 container create 4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_yonath, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec  1 04:15:37 np0005540741 systemd[1]: Started libpod-conmon-4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e.scope.
Dec  1 04:15:37 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:37 np0005540741 podman[100114]: 2025-12-01 09:15:37.740802137 +0000 UTC m=+0.024364524 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:37 np0005540741 podman[100114]: 2025-12-01 09:15:37.835884154 +0000 UTC m=+0.119446521 container init 4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_yonath, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  1 04:15:37 np0005540741 podman[100114]: 2025-12-01 09:15:37.84491181 +0000 UTC m=+0.128474157 container start 4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_yonath, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  1 04:15:37 np0005540741 podman[100114]: 2025-12-01 09:15:37.848854856 +0000 UTC m=+0.132417223 container attach 4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_yonath, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:15:37 np0005540741 vigorous_yonath[100156]: 167 167
Dec  1 04:15:37 np0005540741 systemd[1]: libpod-4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e.scope: Deactivated successfully.
Dec  1 04:15:37 np0005540741 conmon[100156]: conmon 4834e8f5f358af76f540 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e.scope/container/memory.events
Dec  1 04:15:37 np0005540741 podman[100114]: 2025-12-01 09:15:37.852314045 +0000 UTC m=+0.135876402 container died 4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_yonath, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:15:37 np0005540741 systemd[1]: var-lib-containers-storage-overlay-e00d8f7c9645ade1560c28813eb79379342aa5928e31bf433b939565a5f2fb6f-merged.mount: Deactivated successfully.
Dec  1 04:15:37 np0005540741 podman[100114]: 2025-12-01 09:15:37.901026931 +0000 UTC m=+0.184589268 container remove 4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_yonath, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:15:37 np0005540741 systemd[1]: libpod-conmon-4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e.scope: Deactivated successfully.
Dec  1 04:15:37 np0005540741 python3[100153]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   -s -f json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:37 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:15:37 np0005540741 podman[100175]: 2025-12-01 09:15:37.991355167 +0000 UTC m=+0.042544941 container create c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43 (image=quay.io/ceph/ceph:v18, name=confident_chatelet, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec  1 04:15:38 np0005540741 systemd[1]: Started libpod-conmon-c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43.scope.
Dec  1 04:15:38 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3994441c15770f7bcac96802baa5b10446392f18724c28602e22f1b1d11c642c/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3994441c15770f7bcac96802baa5b10446392f18724c28602e22f1b1d11c642c/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:38 np0005540741 podman[100175]: 2025-12-01 09:15:38.065225291 +0000 UTC m=+0.116415085 container init c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43 (image=quay.io/ceph/ceph:v18, name=confident_chatelet, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec  1 04:15:38 np0005540741 podman[100175]: 2025-12-01 09:15:37.972137468 +0000 UTC m=+0.023327262 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:38 np0005540741 podman[100195]: 2025-12-01 09:15:38.069891679 +0000 UTC m=+0.048821690 container create 057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_perlman, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec  1 04:15:38 np0005540741 podman[100175]: 2025-12-01 09:15:38.071260293 +0000 UTC m=+0.122450067 container start c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43 (image=quay.io/ceph/ceph:v18, name=confident_chatelet, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec  1 04:15:38 np0005540741 podman[100175]: 2025-12-01 09:15:38.074148905 +0000 UTC m=+0.125338709 container attach c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43 (image=quay.io/ceph/ceph:v18, name=confident_chatelet, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 04:15:38 np0005540741 systemd[1]: Started libpod-conmon-057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82.scope.
Dec  1 04:15:38 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b5528a13e77310e2f767264d5587e5ec2e1c72d33e8c2f68b21550d5e61de23/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b5528a13e77310e2f767264d5587e5ec2e1c72d33e8c2f68b21550d5e61de23/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b5528a13e77310e2f767264d5587e5ec2e1c72d33e8c2f68b21550d5e61de23/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b5528a13e77310e2f767264d5587e5ec2e1c72d33e8c2f68b21550d5e61de23/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:38 np0005540741 podman[100195]: 2025-12-01 09:15:38.046791516 +0000 UTC m=+0.025721547 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:38 np0005540741 podman[100195]: 2025-12-01 09:15:38.153149241 +0000 UTC m=+0.132079272 container init 057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_perlman, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True)
Dec  1 04:15:38 np0005540741 podman[100195]: 2025-12-01 09:15:38.161130355 +0000 UTC m=+0.140060366 container start 057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_perlman, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec  1 04:15:38 np0005540741 podman[100195]: 2025-12-01 09:15:38.164358797 +0000 UTC m=+0.143288828 container attach 057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_perlman, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:15:38 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.14 scrub starts
Dec  1 04:15:38 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.14 scrub ok
Dec  1 04:15:38 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json"} v 0) v1
Dec  1 04:15:38 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3091290908' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec  1 04:15:38 np0005540741 confident_chatelet[100202]: 
Dec  1 04:15:38 np0005540741 confident_chatelet[100202]: {"fsid":"5620a9fb-e540-5250-a0e8-7aaad5347e3b","health":{"status":"HEALTH_OK","checks":{},"mutes":[]},"election_epoch":5,"quorum":[0],"quorum_names":["compute-0"],"quorum_age":195,"monmap":{"epoch":1,"min_mon_release_name":"reef","num_mons":1},"osdmap":{"epoch":47,"num_osds":3,"num_up_osds":3,"osd_up_since":1764580474,"num_in_osds":3,"osd_in_since":1764580437,"num_remapped_pgs":0},"pgmap":{"pgs_by_state":[{"state_name":"active+clean","count":193}],"num_pgs":193,"num_pools":7,"num_objects":23,"data_bytes":461642,"bytes_used":84344832,"bytes_avail":64327581696,"bytes_total":64411926528,"write_bytes_sec":1465,"read_op_per_sec":0,"write_op_per_sec":4},"fsmap":{"epoch":5,"id":1,"up":1,"in":1,"max":1,"by_rank":[{"filesystem_id":1,"rank":0,"name":"cephfs.compute-0.hrlhzj","status":"up:active","gid":14254}],"up:standby":0},"mgrmap":{"available":true,"num_standbys":0,"modules":["cephadm","iostat","nfs","restful"],"services":{}},"servicemap":{"epoch":5,"modified":"2025-12-01T09:15:37.039828+0000","services":{"mds":{"daemons":{"summary":"","cephfs.compute-0.hrlhzj":{"start_epoch":0,"start_stamp":"0.000000","gid":0,"addr":"(unrecognized address family 0)/0","metadata":{},"task_status":{}}}}}},"progress_events":{}}
Dec  1 04:15:38 np0005540741 systemd[1]: libpod-c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43.scope: Deactivated successfully.
Dec  1 04:15:38 np0005540741 podman[100175]: 2025-12-01 09:15:38.689247143 +0000 UTC m=+0.740436927 container died c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43 (image=quay.io/ceph/ceph:v18, name=confident_chatelet, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:38 np0005540741 systemd[1]: var-lib-containers-storage-overlay-3994441c15770f7bcac96802baa5b10446392f18724c28602e22f1b1d11c642c-merged.mount: Deactivated successfully.
Dec  1 04:15:38 np0005540741 podman[100175]: 2025-12-01 09:15:38.756724444 +0000 UTC m=+0.807914218 container remove c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43 (image=quay.io/ceph/ceph:v18, name=confident_chatelet, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 04:15:38 np0005540741 systemd[1]: libpod-conmon-c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43.scope: Deactivated successfully.
Dec  1 04:15:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v111: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s wr, 4 op/s
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]: {
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:        "osd_id": 0,
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:        "type": "bluestore"
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:    },
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:        "osd_id": 1,
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:        "type": "bluestore"
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:    },
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:        "osd_id": 2,
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:        "type": "bluestore"
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]:    }
Dec  1 04:15:39 np0005540741 optimistic_perlman[100215]: }
Dec  1 04:15:39 np0005540741 systemd[1]: libpod-057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82.scope: Deactivated successfully.
Dec  1 04:15:39 np0005540741 podman[100195]: 2025-12-01 09:15:39.179333904 +0000 UTC m=+1.158263915 container died 057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_perlman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 04:15:39 np0005540741 systemd[1]: libpod-057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82.scope: Consumed 1.023s CPU time.
Dec  1 04:15:39 np0005540741 systemd[1]: var-lib-containers-storage-overlay-0b5528a13e77310e2f767264d5587e5ec2e1c72d33e8c2f68b21550d5e61de23-merged.mount: Deactivated successfully.
Dec  1 04:15:39 np0005540741 podman[100195]: 2025-12-01 09:15:39.229183486 +0000 UTC m=+1.208113497 container remove 057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_perlman, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:15:39 np0005540741 systemd[1]: libpod-conmon-057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82.scope: Deactivated successfully.
Dec  1 04:15:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:15:39 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:15:39 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:39 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 4d4eb745-8d43-41d0-a7d8-61fd31f97c42 does not exist
Dec  1 04:15:39 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.16 scrub starts
Dec  1 04:15:39 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Dec  1 04:15:39 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.16 scrub ok
Dec  1 04:15:39 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Dec  1 04:15:39 np0005540741 python3[100442]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   config dump -f json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:39 np0005540741 podman[100470]: 2025-12-01 09:15:39.7288301 +0000 UTC m=+0.023925191 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:39 np0005540741 podman[100470]: 2025-12-01 09:15:39.825399814 +0000 UTC m=+0.120494885 container create 79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a (image=quay.io/ceph/ceph:v18, name=focused_meitner, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:15:39 np0005540741 systemd[1]: Started libpod-conmon-79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a.scope.
Dec  1 04:15:39 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:39 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50d0cb215a9f1f88f31a19feb60595f6feffcaae4286670be28f48e2a1861436/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:39 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50d0cb215a9f1f88f31a19feb60595f6feffcaae4286670be28f48e2a1861436/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:39 np0005540741 podman[100470]: 2025-12-01 09:15:39.910792514 +0000 UTC m=+0.205887605 container init 79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a (image=quay.io/ceph/ceph:v18, name=focused_meitner, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec  1 04:15:39 np0005540741 podman[100470]: 2025-12-01 09:15:39.918649503 +0000 UTC m=+0.213744594 container start 79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a (image=quay.io/ceph/ceph:v18, name=focused_meitner, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:15:39 np0005540741 podman[100470]: 2025-12-01 09:15:39.921639458 +0000 UTC m=+0.216734559 container attach 79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a (image=quay.io/ceph/ceph:v18, name=focused_meitner, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:39 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:39 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:40 np0005540741 podman[100558]: 2025-12-01 09:15:40.198338258 +0000 UTC m=+0.051627999 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec  1 04:15:40 np0005540741 podman[100558]: 2025-12-01 09:15:40.31531337 +0000 UTC m=+0.168603091 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0) v1
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2956228905' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec  1 04:15:40 np0005540741 focused_meitner[100498]: 
Dec  1 04:15:40 np0005540741 focused_meitner[100498]: [{"section":"global","name":"cluster_network","value":"172.20.0.0/24","level":"advanced","can_update_at_runtime":false,"mask":""},{"section":"global","name":"container_image","value":"quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0","level":"basic","can_update_at_runtime":false,"mask":""},{"section":"global","name":"log_to_file","value":"true","level":"basic","can_update_at_runtime":true,"mask":""},{"section":"global","name":"mon_cluster_log_to_file","value":"true","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"global","name":"ms_bind_ipv4","value":"true","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"global","name":"ms_bind_ipv6","value":"false","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"global","name":"osd_pool_default_size","value":"1","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"global","name":"public_network","value":"192.168.122.0/24","level":"advanced","can_update_at_runtime":false,"mask":""},{"section":"mon","name":"auth_allow_insecure_global_id_reclaim","value":"false","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"mon","name":"mon_warn_on_pool_no_redundancy","value":"false","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"mgr","name":"mgr/cephadm/container_init","value":"True","level":"advanced","can_update_at_runtime":false,"mask":""},{"section":"mgr","name":"mgr/cephadm/migration_current","value":"6","level":"advanced","can_update_at_runtime":false,"mask":""},{"section":"mgr","name":"mgr/cephadm/use_repo_digest","value":"false","level":"advanced","can_update_at_runtime":false,"mask":""},{"section":"mgr","name":"mgr/orchestrator/orchestrator","value":"cephadm","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"mgr","name":"mgr_standby_modules","value":"false","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"osd","name":"osd_memory_target_autotune","value":"true","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"mds.cephfs","name":"mds_join_fs","value":"cephfs","level":"basic","can_update_at_runtime":true,"mask":""}]
Dec  1 04:15:40 np0005540741 systemd[1]: libpod-79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a.scope: Deactivated successfully.
Dec  1 04:15:40 np0005540741 podman[100470]: 2025-12-01 09:15:40.506269099 +0000 UTC m=+0.801364180 container died 79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a (image=quay.io/ceph/ceph:v18, name=focused_meitner, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 04:15:40 np0005540741 systemd[1]: var-lib-containers-storage-overlay-50d0cb215a9f1f88f31a19feb60595f6feffcaae4286670be28f48e2a1861436-merged.mount: Deactivated successfully.
Dec  1 04:15:40 np0005540741 podman[100470]: 2025-12-01 09:15:40.553423575 +0000 UTC m=+0.848518646 container remove 79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a (image=quay.io/ceph/ceph:v18, name=focused_meitner, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:15:40 np0005540741 systemd[1]: libpod-conmon-79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a.scope: Deactivated successfully.
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:40 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 33b134cc-0536-415d-802e-98a63cbf16eb does not exist
Dec  1 04:15:40 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 348d8149-25b0-4f42-825a-54ca2e37b828 does not exist
Dec  1 04:15:40 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 0da4a8eb-855c-41d9-b84d-512a601f067a does not exist
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:40 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:15:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v112: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec  1 04:15:41 np0005540741 podman[100890]: 2025-12-01 09:15:41.382623057 +0000 UTC m=+0.040450924 container create e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_kapitsa, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Dec  1 04:15:41 np0005540741 systemd[1]: Started libpod-conmon-e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b.scope.
Dec  1 04:15:41 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:41 np0005540741 podman[100890]: 2025-12-01 09:15:41.461019075 +0000 UTC m=+0.118846972 container init e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_kapitsa, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec  1 04:15:41 np0005540741 podman[100890]: 2025-12-01 09:15:41.364852544 +0000 UTC m=+0.022680441 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:41 np0005540741 podman[100890]: 2025-12-01 09:15:41.466672754 +0000 UTC m=+0.124500621 container start e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_kapitsa, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  1 04:15:41 np0005540741 stoic_kapitsa[100915]: 167 167
Dec  1 04:15:41 np0005540741 podman[100890]: 2025-12-01 09:15:41.469741602 +0000 UTC m=+0.127569499 container attach e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_kapitsa, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:15:41 np0005540741 systemd[1]: libpod-e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b.scope: Deactivated successfully.
Dec  1 04:15:41 np0005540741 podman[100890]: 2025-12-01 09:15:41.474047689 +0000 UTC m=+0.131875566 container died e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_kapitsa, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:15:41 np0005540741 python3[100909]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd get-require-min-compat-client _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:41 np0005540741 systemd[1]: var-lib-containers-storage-overlay-9c143e1b901c5bd69e14722c71a5a5a151ca74968e02dab646f3fb177d68e208-merged.mount: Deactivated successfully.
Dec  1 04:15:41 np0005540741 podman[100890]: 2025-12-01 09:15:41.510420363 +0000 UTC m=+0.168248230 container remove e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_kapitsa, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:15:41 np0005540741 systemd[1]: libpod-conmon-e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b.scope: Deactivated successfully.
Dec  1 04:15:41 np0005540741 podman[100927]: 2025-12-01 09:15:41.56137919 +0000 UTC m=+0.044228345 container create 92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73 (image=quay.io/ceph/ceph:v18, name=romantic_black, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:15:41 np0005540741 systemd[1]: Started libpod-conmon-92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73.scope.
Dec  1 04:15:41 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:41 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b72ad21ba273ee8d15db9d297e94666334948956b6b1fa51533a9559d4bd057/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:41 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b72ad21ba273ee8d15db9d297e94666334948956b6b1fa51533a9559d4bd057/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:41 np0005540741 podman[100927]: 2025-12-01 09:15:41.627437686 +0000 UTC m=+0.110286871 container init 92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73 (image=quay.io/ceph/ceph:v18, name=romantic_black, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  1 04:15:41 np0005540741 podman[100927]: 2025-12-01 09:15:41.634066886 +0000 UTC m=+0.116916041 container start 92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73 (image=quay.io/ceph/ceph:v18, name=romantic_black, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef)
Dec  1 04:15:41 np0005540741 podman[100927]: 2025-12-01 09:15:41.544153863 +0000 UTC m=+0.027003038 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:41 np0005540741 podman[100927]: 2025-12-01 09:15:41.63859272 +0000 UTC m=+0.121441875 container attach 92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73 (image=quay.io/ceph/ceph:v18, name=romantic_black, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True)
Dec  1 04:15:41 np0005540741 podman[100959]: 2025-12-01 09:15:41.686459539 +0000 UTC m=+0.047264921 container create c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_lamarr, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec  1 04:15:41 np0005540741 systemd[1]: Started libpod-conmon-c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9.scope.
Dec  1 04:15:41 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:41 np0005540741 podman[100959]: 2025-12-01 09:15:41.662964093 +0000 UTC m=+0.023769505 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:41 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fee8141e725dd613c372bb53a33b48d6f1a9b3996ca06e0ed53d8fe860a9000/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:41 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fee8141e725dd613c372bb53a33b48d6f1a9b3996ca06e0ed53d8fe860a9000/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:41 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fee8141e725dd613c372bb53a33b48d6f1a9b3996ca06e0ed53d8fe860a9000/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:41 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fee8141e725dd613c372bb53a33b48d6f1a9b3996ca06e0ed53d8fe860a9000/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:41 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fee8141e725dd613c372bb53a33b48d6f1a9b3996ca06e0ed53d8fe860a9000/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:41 np0005540741 podman[100959]: 2025-12-01 09:15:41.791737189 +0000 UTC m=+0.152542581 container init c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_lamarr, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 04:15:41 np0005540741 podman[100959]: 2025-12-01 09:15:41.799100323 +0000 UTC m=+0.159905705 container start c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_lamarr, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec  1 04:15:41 np0005540741 podman[100959]: 2025-12-01 09:15:41.802561343 +0000 UTC m=+0.163366745 container attach c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_lamarr, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:15:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd get-require-min-compat-client"} v 0) v1
Dec  1 04:15:42 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/951861402' entity='client.admin' cmd=[{"prefix": "osd get-require-min-compat-client"}]: dispatch
Dec  1 04:15:42 np0005540741 romantic_black[100951]: mimic
Dec  1 04:15:42 np0005540741 systemd[1]: libpod-92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73.scope: Deactivated successfully.
Dec  1 04:15:42 np0005540741 podman[101003]: 2025-12-01 09:15:42.274907871 +0000 UTC m=+0.025274753 container died 92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73 (image=quay.io/ceph/ceph:v18, name=romantic_black, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec  1 04:15:42 np0005540741 systemd[1]: var-lib-containers-storage-overlay-2b72ad21ba273ee8d15db9d297e94666334948956b6b1fa51533a9559d4bd057-merged.mount: Deactivated successfully.
Dec  1 04:15:42 np0005540741 podman[101003]: 2025-12-01 09:15:42.330668331 +0000 UTC m=+0.081035183 container remove 92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73 (image=quay.io/ceph/ceph:v18, name=romantic_black, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec  1 04:15:42 np0005540741 systemd[1]: libpod-conmon-92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73.scope: Deactivated successfully.
Dec  1 04:15:42 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1a scrub starts
Dec  1 04:15:42 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1a scrub ok
Dec  1 04:15:42 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.17 scrub starts
Dec  1 04:15:42 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.17 scrub ok
Dec  1 04:15:42 np0005540741 interesting_lamarr[100977]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:15:42 np0005540741 interesting_lamarr[100977]: --> relative data size: 1.0
Dec  1 04:15:42 np0005540741 interesting_lamarr[100977]: --> All data devices are unavailable
Dec  1 04:15:42 np0005540741 systemd[1]: libpod-c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9.scope: Deactivated successfully.
Dec  1 04:15:42 np0005540741 systemd[1]: libpod-c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9.scope: Consumed 1.067s CPU time.
Dec  1 04:15:42 np0005540741 podman[100959]: 2025-12-01 09:15:42.939005184 +0000 UTC m=+1.299810576 container died c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_lamarr, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  1 04:15:42 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:15:42 np0005540741 systemd[1]: var-lib-containers-storage-overlay-4fee8141e725dd613c372bb53a33b48d6f1a9b3996ca06e0ed53d8fe860a9000-merged.mount: Deactivated successfully.
Dec  1 04:15:43 np0005540741 podman[100959]: 2025-12-01 09:15:43.001354642 +0000 UTC m=+1.362160074 container remove c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_lamarr, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Dec  1 04:15:43 np0005540741 systemd[1]: libpod-conmon-c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9.scope: Deactivated successfully.
Dec  1 04:15:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:15:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:15:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v113: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec  1 04:15:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:15:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:15:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:15:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:15:43 np0005540741 python3[101127]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   versions -f json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:15:43 np0005540741 podman[101181]: 2025-12-01 09:15:43.360285311 +0000 UTC m=+0.050261976 container create d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2 (image=quay.io/ceph/ceph:v18, name=upbeat_bohr, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:15:43 np0005540741 systemd[1]: Started libpod-conmon-d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2.scope.
Dec  1 04:15:43 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:43 np0005540741 podman[101181]: 2025-12-01 09:15:43.335739552 +0000 UTC m=+0.025716237 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec  1 04:15:43 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ee2b9c64821ba316465b4e7d4ba7fa9e75fe7d0d1acb6d33683fb83290c3149/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:43 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ee2b9c64821ba316465b4e7d4ba7fa9e75fe7d0d1acb6d33683fb83290c3149/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:43 np0005540741 podman[101181]: 2025-12-01 09:15:43.445774684 +0000 UTC m=+0.135751379 container init d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2 (image=quay.io/ceph/ceph:v18, name=upbeat_bohr, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec  1 04:15:43 np0005540741 podman[101181]: 2025-12-01 09:15:43.453017894 +0000 UTC m=+0.142994569 container start d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2 (image=quay.io/ceph/ceph:v18, name=upbeat_bohr, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:15:43 np0005540741 podman[101181]: 2025-12-01 09:15:43.457747504 +0000 UTC m=+0.147724189 container attach d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2 (image=quay.io/ceph/ceph:v18, name=upbeat_bohr, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:15:43 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Dec  1 04:15:43 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Dec  1 04:15:43 np0005540741 podman[101240]: 2025-12-01 09:15:43.62779756 +0000 UTC m=+0.037427239 container create c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_roentgen, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  1 04:15:43 np0005540741 systemd[1]: Started libpod-conmon-c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5.scope.
Dec  1 04:15:43 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:43 np0005540741 podman[101240]: 2025-12-01 09:15:43.611637087 +0000 UTC m=+0.021266786 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:43 np0005540741 podman[101240]: 2025-12-01 09:15:43.712984353 +0000 UTC m=+0.122614072 container init c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_roentgen, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec  1 04:15:43 np0005540741 podman[101240]: 2025-12-01 09:15:43.72264507 +0000 UTC m=+0.132274759 container start c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_roentgen, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:15:43 np0005540741 intelligent_roentgen[101256]: 167 167
Dec  1 04:15:43 np0005540741 podman[101240]: 2025-12-01 09:15:43.726386418 +0000 UTC m=+0.136016137 container attach c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_roentgen, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True)
Dec  1 04:15:43 np0005540741 systemd[1]: libpod-c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5.scope: Deactivated successfully.
Dec  1 04:15:43 np0005540741 podman[101240]: 2025-12-01 09:15:43.727632468 +0000 UTC m=+0.137262167 container died c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_roentgen, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:15:43 np0005540741 systemd[1]: var-lib-containers-storage-overlay-6f7388fa5e5883bbf2d80d7669fc54258c873712ed6e250f5d09dd17d663eafe-merged.mount: Deactivated successfully.
Dec  1 04:15:43 np0005540741 podman[101240]: 2025-12-01 09:15:43.768935249 +0000 UTC m=+0.178564928 container remove c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_roentgen, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:15:43 np0005540741 systemd[1]: libpod-conmon-c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5.scope: Deactivated successfully.
Dec  1 04:15:43 np0005540741 podman[101299]: 2025-12-01 09:15:43.924568197 +0000 UTC m=+0.041066464 container create 18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_williams, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 04:15:43 np0005540741 systemd[1]: Started libpod-conmon-18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6.scope.
Dec  1 04:15:43 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:43 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49daba2b0a33257a9368df698cc66aeafeb40360dcc9dda283632220b4353201/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:43 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49daba2b0a33257a9368df698cc66aeafeb40360dcc9dda283632220b4353201/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:43 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49daba2b0a33257a9368df698cc66aeafeb40360dcc9dda283632220b4353201/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:43 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49daba2b0a33257a9368df698cc66aeafeb40360dcc9dda283632220b4353201/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:43 np0005540741 podman[101299]: 2025-12-01 09:15:43.998145582 +0000 UTC m=+0.114643889 container init 18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_williams, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:15:44 np0005540741 podman[101299]: 2025-12-01 09:15:43.90764011 +0000 UTC m=+0.024138397 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:44 np0005540741 podman[101299]: 2025-12-01 09:15:44.005533226 +0000 UTC m=+0.122031493 container start 18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_williams, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True)
Dec  1 04:15:44 np0005540741 podman[101299]: 2025-12-01 09:15:44.009945216 +0000 UTC m=+0.126443483 container attach 18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_williams, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec  1 04:15:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "versions", "format": "json"} v 0) v1
Dec  1 04:15:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1401116377' entity='client.admin' cmd=[{"prefix": "versions", "format": "json"}]: dispatch
Dec  1 04:15:44 np0005540741 upbeat_bohr[101196]: 
Dec  1 04:15:44 np0005540741 systemd[1]: libpod-d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2.scope: Deactivated successfully.
Dec  1 04:15:44 np0005540741 upbeat_bohr[101196]: {"mon":{"ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable)":1},"mgr":{"ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable)":1},"osd":{"ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable)":3},"mds":{"ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable)":1},"overall":{"ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable)":6}}
Dec  1 04:15:44 np0005540741 podman[101181]: 2025-12-01 09:15:44.106214801 +0000 UTC m=+0.796191466 container died d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2 (image=quay.io/ceph/ceph:v18, name=upbeat_bohr, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:15:44 np0005540741 systemd[1]: var-lib-containers-storage-overlay-0ee2b9c64821ba316465b4e7d4ba7fa9e75fe7d0d1acb6d33683fb83290c3149-merged.mount: Deactivated successfully.
Dec  1 04:15:44 np0005540741 podman[101181]: 2025-12-01 09:15:44.150734544 +0000 UTC m=+0.840711209 container remove d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2 (image=quay.io/ceph/ceph:v18, name=upbeat_bohr, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:15:44 np0005540741 systemd[1]: libpod-conmon-d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2.scope: Deactivated successfully.
Dec  1 04:15:44 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1e scrub starts
Dec  1 04:15:44 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1e scrub ok
Dec  1 04:15:44 np0005540741 admiring_williams[101316]: {
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:    "0": [
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:        {
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "devices": [
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "/dev/loop3"
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            ],
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "lv_name": "ceph_lv0",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "lv_size": "21470642176",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "name": "ceph_lv0",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "tags": {
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.cluster_name": "ceph",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.crush_device_class": "",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.encrypted": "0",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.osd_id": "0",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.type": "block",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.vdo": "0"
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            },
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "type": "block",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "vg_name": "ceph_vg0"
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:        }
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:    ],
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:    "1": [
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:        {
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "devices": [
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "/dev/loop4"
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            ],
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "lv_name": "ceph_lv1",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "lv_size": "21470642176",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "name": "ceph_lv1",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "tags": {
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.cluster_name": "ceph",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.crush_device_class": "",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.encrypted": "0",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.osd_id": "1",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.type": "block",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.vdo": "0"
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            },
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "type": "block",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "vg_name": "ceph_vg1"
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:        }
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:    ],
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:    "2": [
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:        {
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "devices": [
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "/dev/loop5"
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            ],
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "lv_name": "ceph_lv2",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "lv_size": "21470642176",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "name": "ceph_lv2",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "tags": {
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.cluster_name": "ceph",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.crush_device_class": "",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.encrypted": "0",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.osd_id": "2",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.type": "block",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:                "ceph.vdo": "0"
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            },
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "type": "block",
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:            "vg_name": "ceph_vg2"
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:        }
Dec  1 04:15:44 np0005540741 admiring_williams[101316]:    ]
Dec  1 04:15:44 np0005540741 admiring_williams[101316]: }
Dec  1 04:15:44 np0005540741 systemd[1]: libpod-18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6.scope: Deactivated successfully.
Dec  1 04:15:44 np0005540741 podman[101339]: 2025-12-01 09:15:44.8543031 +0000 UTC m=+0.023410624 container died 18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_williams, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Dec  1 04:15:44 np0005540741 systemd[1]: var-lib-containers-storage-overlay-49daba2b0a33257a9368df698cc66aeafeb40360dcc9dda283632220b4353201-merged.mount: Deactivated successfully.
Dec  1 04:15:44 np0005540741 podman[101339]: 2025-12-01 09:15:44.90220911 +0000 UTC m=+0.071316624 container remove 18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_williams, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 04:15:44 np0005540741 systemd[1]: libpod-conmon-18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6.scope: Deactivated successfully.
Dec  1 04:15:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v114: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s wr, 2 op/s
Dec  1 04:15:45 np0005540741 podman[101493]: 2025-12-01 09:15:45.473951742 +0000 UTC m=+0.039387470 container create 368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_chatelet, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec  1 04:15:45 np0005540741 systemd[1]: Started libpod-conmon-368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6.scope.
Dec  1 04:15:45 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:45 np0005540741 podman[101493]: 2025-12-01 09:15:45.549039415 +0000 UTC m=+0.114475163 container init 368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_chatelet, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:15:45 np0005540741 podman[101493]: 2025-12-01 09:15:45.456899011 +0000 UTC m=+0.022334759 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:45 np0005540741 podman[101493]: 2025-12-01 09:15:45.55486616 +0000 UTC m=+0.120301888 container start 368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_chatelet, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507)
Dec  1 04:15:45 np0005540741 podman[101493]: 2025-12-01 09:15:45.55834557 +0000 UTC m=+0.123781298 container attach 368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_chatelet, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:15:45 np0005540741 priceless_chatelet[101509]: 167 167
Dec  1 04:15:45 np0005540741 systemd[1]: libpod-368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6.scope: Deactivated successfully.
Dec  1 04:15:45 np0005540741 podman[101493]: 2025-12-01 09:15:45.559969512 +0000 UTC m=+0.125405240 container died 368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_chatelet, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:15:45 np0005540741 systemd[1]: var-lib-containers-storage-overlay-cf16fc7b86721153131d516e223146f8627eeda081d424595ee8eb12ab141250-merged.mount: Deactivated successfully.
Dec  1 04:15:45 np0005540741 podman[101493]: 2025-12-01 09:15:45.594512398 +0000 UTC m=+0.159948126 container remove 368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_chatelet, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  1 04:15:45 np0005540741 systemd[1]: libpod-conmon-368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6.scope: Deactivated successfully.
Dec  1 04:15:45 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.19 scrub starts
Dec  1 04:15:45 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.19 scrub ok
Dec  1 04:15:45 np0005540741 podman[101532]: 2025-12-01 09:15:45.748621098 +0000 UTC m=+0.040105154 container create 86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_hugle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec  1 04:15:45 np0005540741 systemd[1]: Started libpod-conmon-86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957.scope.
Dec  1 04:15:45 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:15:45 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03edf129755e443dd4d0fe98c37c00e242cbc121c8c5a7856402ec876cb233bb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:45 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03edf129755e443dd4d0fe98c37c00e242cbc121c8c5a7856402ec876cb233bb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:45 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03edf129755e443dd4d0fe98c37c00e242cbc121c8c5a7856402ec876cb233bb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:45 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03edf129755e443dd4d0fe98c37c00e242cbc121c8c5a7856402ec876cb233bb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:15:45 np0005540741 podman[101532]: 2025-12-01 09:15:45.827657066 +0000 UTC m=+0.119141142 container init 86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_hugle, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 04:15:45 np0005540741 podman[101532]: 2025-12-01 09:15:45.732173236 +0000 UTC m=+0.023657312 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:15:45 np0005540741 podman[101532]: 2025-12-01 09:15:45.837806638 +0000 UTC m=+0.129290694 container start 86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_hugle, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Dec  1 04:15:45 np0005540741 podman[101532]: 2025-12-01 09:15:45.840816084 +0000 UTC m=+0.132300140 container attach 86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_hugle, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec  1 04:15:46 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Dec  1 04:15:46 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Dec  1 04:15:46 np0005540741 strange_hugle[101549]: {
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:        "osd_id": 0,
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:        "type": "bluestore"
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:    },
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:        "osd_id": 1,
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:        "type": "bluestore"
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:    },
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:        "osd_id": 2,
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:        "type": "bluestore"
Dec  1 04:15:46 np0005540741 strange_hugle[101549]:    }
Dec  1 04:15:46 np0005540741 strange_hugle[101549]: }
Dec  1 04:15:46 np0005540741 systemd[1]: libpod-86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957.scope: Deactivated successfully.
Dec  1 04:15:46 np0005540741 podman[101532]: 2025-12-01 09:15:46.834482343 +0000 UTC m=+1.125966399 container died 86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_hugle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:15:46 np0005540741 systemd[1]: libpod-86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957.scope: Consumed 1.004s CPU time.
Dec  1 04:15:46 np0005540741 systemd[1]: var-lib-containers-storage-overlay-03edf129755e443dd4d0fe98c37c00e242cbc121c8c5a7856402ec876cb233bb-merged.mount: Deactivated successfully.
Dec  1 04:15:46 np0005540741 podman[101532]: 2025-12-01 09:15:46.892655539 +0000 UTC m=+1.184139595 container remove 86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_hugle, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Dec  1 04:15:46 np0005540741 systemd[1]: libpod-conmon-86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957.scope: Deactivated successfully.
Dec  1 04:15:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:15:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:15:46 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:46 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 9a0b826e-5c61-49b1-9445-7eec89f0ba24 does not exist
Dec  1 04:15:47 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:47 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:15:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v115: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s wr, 2 op/s
Dec  1 04:15:47 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.6 scrub starts
Dec  1 04:15:47 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.6 scrub ok
Dec  1 04:15:47 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:15:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v116: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 0 B/s wr, 0 op/s
Dec  1 04:15:49 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Dec  1 04:15:49 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Dec  1 04:15:49 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Dec  1 04:15:49 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.19 scrub ok
Dec  1 04:15:50 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.a scrub starts
Dec  1 04:15:50 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.a scrub ok
Dec  1 04:15:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v117: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:51 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.b scrub starts
Dec  1 04:15:51 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.b scrub ok
Dec  1 04:15:52 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1e scrub starts
Dec  1 04:15:52 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1e scrub ok
Dec  1 04:15:52 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:15:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v118: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:54 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.d scrub starts
Dec  1 04:15:54 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.d scrub ok
Dec  1 04:15:54 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Dec  1 04:15:54 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Dec  1 04:15:54 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1f scrub starts
Dec  1 04:15:54 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1f scrub ok
Dec  1 04:15:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v119: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:55 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1c scrub starts
Dec  1 04:15:55 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1c scrub ok
Dec  1 04:15:55 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.3 scrub starts
Dec  1 04:15:55 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.3 scrub ok
Dec  1 04:15:56 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.e scrub starts
Dec  1 04:15:56 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.e scrub ok
Dec  1 04:15:56 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.5 scrub starts
Dec  1 04:15:56 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.5 scrub ok
Dec  1 04:15:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v120: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:15:57 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.7 scrub starts
Dec  1 04:15:57 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.7 scrub ok
Dec  1 04:15:57 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.7 scrub starts
Dec  1 04:15:57 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.7 scrub ok
Dec  1 04:15:57 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:15:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v121: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:00 np0005540741 systemd-logind[788]: New session 34 of user zuul.
Dec  1 04:16:00 np0005540741 systemd[1]: Started Session 34 of User zuul.
Dec  1 04:16:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v122: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:01 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.b scrub starts
Dec  1 04:16:01 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.b scrub ok
Dec  1 04:16:01 np0005540741 python3.9[101799]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:16:01 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.9 scrub starts
Dec  1 04:16:01 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.9 scrub ok
Dec  1 04:16:02 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:16:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v123: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:03 np0005540741 python3.9[102017]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:16:04 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.a scrub starts
Dec  1 04:16:04 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.a scrub ok
Dec  1 04:16:04 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.10 deep-scrub starts
Dec  1 04:16:04 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.10 deep-scrub ok
Dec  1 04:16:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v124: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:05 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.10 scrub starts
Dec  1 04:16:05 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.10 scrub ok
Dec  1 04:16:06 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.d scrub starts
Dec  1 04:16:06 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.d scrub ok
Dec  1 04:16:06 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.17 scrub starts
Dec  1 04:16:06 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.17 scrub ok
Dec  1 04:16:06 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.12 scrub starts
Dec  1 04:16:06 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.12 scrub ok
Dec  1 04:16:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v125: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:07 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:16:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v126: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:09 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.10 deep-scrub starts
Dec  1 04:16:09 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.10 deep-scrub ok
Dec  1 04:16:09 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1b scrub starts
Dec  1 04:16:09 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1b scrub ok
Dec  1 04:16:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v127: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:11 np0005540741 systemd[1]: session-34.scope: Deactivated successfully.
Dec  1 04:16:11 np0005540741 systemd[1]: session-34.scope: Consumed 8.800s CPU time.
Dec  1 04:16:11 np0005540741 systemd-logind[788]: Session 34 logged out. Waiting for processes to exit.
Dec  1 04:16:11 np0005540741 systemd-logind[788]: Removed session 34.
Dec  1 04:16:12 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.16 scrub starts
Dec  1 04:16:12 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.16 scrub ok
Dec  1 04:16:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:16:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:16:12
Dec  1 04:16:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:16:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:16:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['cephfs.cephfs.meta', 'cephfs.cephfs.data', 'images', 'backups', 'vms', '.mgr', 'volumes']
Dec  1 04:16:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:16:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:16:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:16:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:16:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:16:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:16:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:16:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:16:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:16:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:16:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:16:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v128: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:16:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:16:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:16:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:16:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:16:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:16:13 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.18 scrub starts
Dec  1 04:16:13 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.18 scrub ok
Dec  1 04:16:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v129: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:15 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.12 deep-scrub starts
Dec  1 04:16:15 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.12 deep-scrub ok
Dec  1 04:16:15 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.19 scrub starts
Dec  1 04:16:15 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.19 scrub ok
Dec  1 04:16:16 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1c scrub starts
Dec  1 04:16:16 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1c scrub ok
Dec  1 04:16:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v130: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:17 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.1a deep-scrub starts
Dec  1 04:16:17 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1f scrub starts
Dec  1 04:16:17 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.1a deep-scrub ok
Dec  1 04:16:17 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1f scrub ok
Dec  1 04:16:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:16:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:16:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:16:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:16:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:16:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:16:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:16:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:16:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:16:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:16:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:16:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:16:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:16:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:16:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:16:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:16:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v131: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:19 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Dec  1 04:16:19 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Dec  1 04:16:20 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Dec  1 04:16:20 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Dec  1 04:16:20 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.1b scrub starts
Dec  1 04:16:20 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.1b scrub ok
Dec  1 04:16:20 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Dec  1 04:16:20 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Dec  1 04:16:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v132: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:21 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.1e scrub starts
Dec  1 04:16:21 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.1e scrub ok
Dec  1 04:16:22 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Dec  1 04:16:22 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Dec  1 04:16:22 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Dec  1 04:16:22 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Dec  1 04:16:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:16:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v133: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:24 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Dec  1 04:16:24 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Dec  1 04:16:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v134: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:26 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.17 scrub starts
Dec  1 04:16:26 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.17 scrub ok
Dec  1 04:16:26 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Dec  1 04:16:26 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Dec  1 04:16:26 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.f scrub starts
Dec  1 04:16:26 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.f scrub ok
Dec  1 04:16:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v135: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:27 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.19 scrub starts
Dec  1 04:16:27 np0005540741 systemd-logind[788]: New session 35 of user zuul.
Dec  1 04:16:27 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.19 scrub ok
Dec  1 04:16:27 np0005540741 systemd[1]: Started Session 35 of User zuul.
Dec  1 04:16:27 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.16 deep-scrub starts
Dec  1 04:16:27 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.16 deep-scrub ok
Dec  1 04:16:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:16:28 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Dec  1 04:16:28 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Dec  1 04:16:28 np0005540741 python3.9[102227]: ansible-ansible.legacy.ping Invoked with data=pong
Dec  1 04:16:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v136: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:29 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.1d scrub starts
Dec  1 04:16:29 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.1d scrub ok
Dec  1 04:16:30 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.1e scrub starts
Dec  1 04:16:30 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.1e scrub ok
Dec  1 04:16:30 np0005540741 python3.9[102401]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:16:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v137: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:31 np0005540741 python3.9[102557]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:16:32 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.14 scrub starts
Dec  1 04:16:32 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.14 scrub ok
Dec  1 04:16:32 np0005540741 python3.9[102710]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:16:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v138: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:33 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1e scrub starts
Dec  1 04:16:33 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1e scrub ok
Dec  1 04:16:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:16:33 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.15 deep-scrub starts
Dec  1 04:16:33 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.15 deep-scrub ok
Dec  1 04:16:33 np0005540741 python3.9[102864]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:16:33 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.e scrub starts
Dec  1 04:16:33 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.e scrub ok
Dec  1 04:16:34 np0005540741 python3.9[103016]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:16:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v139: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:35 np0005540741 python3.9[103166]: ansible-ansible.builtin.service_facts Invoked
Dec  1 04:16:35 np0005540741 network[103183]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:16:35 np0005540741 network[103184]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:16:35 np0005540741 network[103185]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:16:35 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.f scrub starts
Dec  1 04:16:35 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.f scrub ok
Dec  1 04:16:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v140: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:37 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Dec  1 04:16:37 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Dec  1 04:16:38 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.19 scrub starts
Dec  1 04:16:38 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Dec  1 04:16:38 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.7 scrub starts
Dec  1 04:16:38 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.7 scrub ok
Dec  1 04:16:38 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:16:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v141: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:39 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Dec  1 04:16:39 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Dec  1 04:16:40 np0005540741 python3.9[103445]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:16:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v142: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:41 np0005540741 python3.9[103595]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:16:41 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.11 scrub starts
Dec  1 04:16:41 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.11 scrub ok
Dec  1 04:16:42 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Dec  1 04:16:42 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Dec  1 04:16:42 np0005540741 python3.9[103749]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:16:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:16:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:16:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:16:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:16:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:16:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:16:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v143: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:43 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Dec  1 04:16:43 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Dec  1 04:16:43 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:16:43 np0005540741 python3.9[103907]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:16:44 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Dec  1 04:16:44 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Dec  1 04:16:44 np0005540741 python3.9[103991]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:16:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v144: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:45 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.8 scrub starts
Dec  1 04:16:45 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.8 scrub ok
Dec  1 04:16:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v145: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:47 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.a scrub starts
Dec  1 04:16:47 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.a scrub ok
Dec  1 04:16:48 np0005540741 podman[104228]: 2025-12-01 09:16:48.242558863 +0000 UTC m=+0.478844759 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Dec  1 04:16:48 np0005540741 podman[104228]: 2025-12-01 09:16:48.338715168 +0000 UTC m=+0.575001044 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Dec  1 04:16:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:16:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:16:48 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:16:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:16:48 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:16:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v146: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:49 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:16:49 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:16:49 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Dec  1 04:16:49 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.5 scrub ok
Dec  1 04:16:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:16:49 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:16:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:16:49 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:16:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:16:49 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:16:49 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 2e9f07b2-5977-43a9-93f8-19beb8bb89ba does not exist
Dec  1 04:16:49 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 62e2a1ab-02f6-460e-85f3-0507d216486a does not exist
Dec  1 04:16:49 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 56b3bf1c-3b66-49d9-966c-fe45e73b3855 does not exist
Dec  1 04:16:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:16:49 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:16:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:16:49 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:16:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:16:49 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:16:50 np0005540741 podman[104647]: 2025-12-01 09:16:50.136703191 +0000 UTC m=+0.038231348 container create fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_chandrasekhar, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:16:50 np0005540741 systemd[1]: Started libpod-conmon-fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc.scope.
Dec  1 04:16:50 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:16:50 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:16:50 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:16:50 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:16:50 np0005540741 podman[104647]: 2025-12-01 09:16:50.120246403 +0000 UTC m=+0.021774580 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:16:50 np0005540741 podman[104647]: 2025-12-01 09:16:50.221946856 +0000 UTC m=+0.123475033 container init fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_chandrasekhar, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Dec  1 04:16:50 np0005540741 podman[104647]: 2025-12-01 09:16:50.230040746 +0000 UTC m=+0.131568903 container start fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_chandrasekhar, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:16:50 np0005540741 podman[104647]: 2025-12-01 09:16:50.233887915 +0000 UTC m=+0.135416102 container attach fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_chandrasekhar, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 04:16:50 np0005540741 boring_chandrasekhar[104664]: 167 167
Dec  1 04:16:50 np0005540741 systemd[1]: libpod-fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc.scope: Deactivated successfully.
Dec  1 04:16:50 np0005540741 podman[104647]: 2025-12-01 09:16:50.23826111 +0000 UTC m=+0.139789267 container died fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_chandrasekhar, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:16:50 np0005540741 systemd[1]: var-lib-containers-storage-overlay-87e0525d1a5c733159722fb2a1db49d5bab1f5d77bdc7ecf59b145ce89bd2ac8-merged.mount: Deactivated successfully.
Dec  1 04:16:50 np0005540741 podman[104647]: 2025-12-01 09:16:50.276222679 +0000 UTC m=+0.177750836 container remove fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_chandrasekhar, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:16:50 np0005540741 systemd[1]: libpod-conmon-fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc.scope: Deactivated successfully.
Dec  1 04:16:50 np0005540741 podman[104688]: 2025-12-01 09:16:50.446556463 +0000 UTC m=+0.062243421 container create 0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poitras, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:16:50 np0005540741 systemd[1]: Started libpod-conmon-0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6.scope.
Dec  1 04:16:50 np0005540741 podman[104688]: 2025-12-01 09:16:50.41656934 +0000 UTC m=+0.032256368 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:16:50 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:16:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3248251a3969ae7c03ae3d850d86ad608380a03b8474a128dcc5a3c731260f78/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:16:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3248251a3969ae7c03ae3d850d86ad608380a03b8474a128dcc5a3c731260f78/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:16:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3248251a3969ae7c03ae3d850d86ad608380a03b8474a128dcc5a3c731260f78/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:16:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3248251a3969ae7c03ae3d850d86ad608380a03b8474a128dcc5a3c731260f78/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:16:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3248251a3969ae7c03ae3d850d86ad608380a03b8474a128dcc5a3c731260f78/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:16:50 np0005540741 podman[104688]: 2025-12-01 09:16:50.534410982 +0000 UTC m=+0.150097970 container init 0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poitras, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:16:50 np0005540741 podman[104688]: 2025-12-01 09:16:50.541307208 +0000 UTC m=+0.156994126 container start 0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poitras, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:16:50 np0005540741 podman[104688]: 2025-12-01 09:16:50.544731185 +0000 UTC m=+0.160418193 container attach 0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poitras, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  1 04:16:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v147: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:51 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.b scrub starts
Dec  1 04:16:51 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.b scrub ok
Dec  1 04:16:51 np0005540741 silly_poitras[104705]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:16:51 np0005540741 silly_poitras[104705]: --> relative data size: 1.0
Dec  1 04:16:51 np0005540741 silly_poitras[104705]: --> All data devices are unavailable
Dec  1 04:16:51 np0005540741 systemd[1]: libpod-0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6.scope: Deactivated successfully.
Dec  1 04:16:51 np0005540741 podman[104688]: 2025-12-01 09:16:51.57802276 +0000 UTC m=+1.193709688 container died 0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poitras, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:16:51 np0005540741 systemd[1]: var-lib-containers-storage-overlay-3248251a3969ae7c03ae3d850d86ad608380a03b8474a128dcc5a3c731260f78-merged.mount: Deactivated successfully.
Dec  1 04:16:51 np0005540741 podman[104688]: 2025-12-01 09:16:51.636356599 +0000 UTC m=+1.252043517 container remove 0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poitras, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec  1 04:16:51 np0005540741 systemd[1]: libpod-conmon-0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6.scope: Deactivated successfully.
Dec  1 04:16:52 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.f deep-scrub starts
Dec  1 04:16:52 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.f deep-scrub ok
Dec  1 04:16:52 np0005540741 podman[104884]: 2025-12-01 09:16:52.253677815 +0000 UTC m=+0.046889015 container create 5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_ritchie, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:16:52 np0005540741 systemd[1]: Started libpod-conmon-5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af.scope.
Dec  1 04:16:52 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:16:52 np0005540741 podman[104884]: 2025-12-01 09:16:52.324763336 +0000 UTC m=+0.117974556 container init 5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_ritchie, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec  1 04:16:52 np0005540741 podman[104884]: 2025-12-01 09:16:52.330972883 +0000 UTC m=+0.124184083 container start 5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_ritchie, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:16:52 np0005540741 podman[104884]: 2025-12-01 09:16:52.237100103 +0000 UTC m=+0.030311323 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:16:52 np0005540741 adoring_ritchie[104899]: 167 167
Dec  1 04:16:52 np0005540741 systemd[1]: libpod-5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af.scope: Deactivated successfully.
Dec  1 04:16:52 np0005540741 podman[104884]: 2025-12-01 09:16:52.345685361 +0000 UTC m=+0.138896561 container attach 5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_ritchie, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec  1 04:16:52 np0005540741 podman[104884]: 2025-12-01 09:16:52.346362751 +0000 UTC m=+0.139573961 container died 5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_ritchie, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  1 04:16:52 np0005540741 systemd[1]: var-lib-containers-storage-overlay-ee324b259928c7cded8e9ffe8c6b09e8c106a0f3f9753a8654adf8fcf98e8fb2-merged.mount: Deactivated successfully.
Dec  1 04:16:52 np0005540741 podman[104884]: 2025-12-01 09:16:52.380733488 +0000 UTC m=+0.173944688 container remove 5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_ritchie, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:16:52 np0005540741 systemd[1]: libpod-conmon-5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af.scope: Deactivated successfully.
Dec  1 04:16:52 np0005540741 podman[104922]: 2025-12-01 09:16:52.536042865 +0000 UTC m=+0.038082894 container create 55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_mirzakhani, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:16:52 np0005540741 systemd[1]: Started libpod-conmon-55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6.scope.
Dec  1 04:16:52 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:16:52 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2a48ce4fda5e118f4a811f4858503a33d1757c6f2c4acc1b074b5c305d10a54/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:16:52 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2a48ce4fda5e118f4a811f4858503a33d1757c6f2c4acc1b074b5c305d10a54/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:16:52 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2a48ce4fda5e118f4a811f4858503a33d1757c6f2c4acc1b074b5c305d10a54/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:16:52 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2a48ce4fda5e118f4a811f4858503a33d1757c6f2c4acc1b074b5c305d10a54/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:16:52 np0005540741 podman[104922]: 2025-12-01 09:16:52.595378772 +0000 UTC m=+0.097418891 container init 55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_mirzakhani, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Dec  1 04:16:52 np0005540741 podman[104922]: 2025-12-01 09:16:52.602563407 +0000 UTC m=+0.104603436 container start 55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_mirzakhani, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:16:52 np0005540741 podman[104922]: 2025-12-01 09:16:52.605611123 +0000 UTC m=+0.107651172 container attach 55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_mirzakhani, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  1 04:16:52 np0005540741 podman[104922]: 2025-12-01 09:16:52.520070191 +0000 UTC m=+0.022110250 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:16:52 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Dec  1 04:16:52 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Dec  1 04:16:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v148: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]: {
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:    "0": [
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:        {
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "devices": [
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "/dev/loop3"
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            ],
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "lv_name": "ceph_lv0",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "lv_size": "21470642176",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "name": "ceph_lv0",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "tags": {
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.cluster_name": "ceph",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.crush_device_class": "",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.encrypted": "0",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.osd_id": "0",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.type": "block",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.vdo": "0"
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            },
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "type": "block",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "vg_name": "ceph_vg0"
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:        }
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:    ],
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:    "1": [
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:        {
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "devices": [
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "/dev/loop4"
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            ],
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "lv_name": "ceph_lv1",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "lv_size": "21470642176",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "name": "ceph_lv1",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "tags": {
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.cluster_name": "ceph",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.crush_device_class": "",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.encrypted": "0",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.osd_id": "1",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.type": "block",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.vdo": "0"
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            },
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "type": "block",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "vg_name": "ceph_vg1"
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:        }
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:    ],
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:    "2": [
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:        {
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "devices": [
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "/dev/loop5"
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            ],
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "lv_name": "ceph_lv2",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "lv_size": "21470642176",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "name": "ceph_lv2",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "tags": {
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.cluster_name": "ceph",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.crush_device_class": "",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.encrypted": "0",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.osd_id": "2",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.type": "block",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:                "ceph.vdo": "0"
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            },
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "type": "block",
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:            "vg_name": "ceph_vg2"
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:        }
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]:    ]
Dec  1 04:16:53 np0005540741 trusting_mirzakhani[104938]: }
Dec  1 04:16:53 np0005540741 systemd[1]: libpod-55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6.scope: Deactivated successfully.
Dec  1 04:16:53 np0005540741 conmon[104938]: conmon 55d006748401b1455d62 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6.scope/container/memory.events
Dec  1 04:16:53 np0005540741 podman[104922]: 2025-12-01 09:16:53.340870503 +0000 UTC m=+0.842910532 container died 55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_mirzakhani, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:16:53 np0005540741 systemd[1]: var-lib-containers-storage-overlay-d2a48ce4fda5e118f4a811f4858503a33d1757c6f2c4acc1b074b5c305d10a54-merged.mount: Deactivated successfully.
Dec  1 04:16:53 np0005540741 podman[104922]: 2025-12-01 09:16:53.401842397 +0000 UTC m=+0.903882446 container remove 55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_mirzakhani, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:16:53 np0005540741 systemd[1]: libpod-conmon-55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6.scope: Deactivated successfully.
Dec  1 04:16:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:16:53 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.14 scrub starts
Dec  1 04:16:53 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.14 scrub ok
Dec  1 04:16:54 np0005540741 podman[105102]: 2025-12-01 09:16:54.001781259 +0000 UTC m=+0.048296714 container create a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhaskara, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:16:54 np0005540741 systemd[1]: Started libpod-conmon-a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6.scope.
Dec  1 04:16:54 np0005540741 podman[105102]: 2025-12-01 09:16:53.979619239 +0000 UTC m=+0.026134744 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:16:54 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:16:54 np0005540741 podman[105102]: 2025-12-01 09:16:54.089933606 +0000 UTC m=+0.136449061 container init a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhaskara, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:16:54 np0005540741 podman[105102]: 2025-12-01 09:16:54.097590144 +0000 UTC m=+0.144105599 container start a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhaskara, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:16:54 np0005540741 podman[105102]: 2025-12-01 09:16:54.101056762 +0000 UTC m=+0.147572247 container attach a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhaskara, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:16:54 np0005540741 systemd[1]: libpod-a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6.scope: Deactivated successfully.
Dec  1 04:16:54 np0005540741 admiring_bhaskara[105118]: 167 167
Dec  1 04:16:54 np0005540741 conmon[105118]: conmon a08071b70d525a2316a9 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6.scope/container/memory.events
Dec  1 04:16:54 np0005540741 podman[105102]: 2025-12-01 09:16:54.10413414 +0000 UTC m=+0.150649595 container died a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhaskara, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:16:54 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Dec  1 04:16:54 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Dec  1 04:16:54 np0005540741 systemd[1]: var-lib-containers-storage-overlay-58949cd145715e7c1ecb3eaf349e3f39b4899de6216dc16df083dd08b9efa073-merged.mount: Deactivated successfully.
Dec  1 04:16:54 np0005540741 podman[105102]: 2025-12-01 09:16:54.138690053 +0000 UTC m=+0.185205508 container remove a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhaskara, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:16:54 np0005540741 systemd[1]: libpod-conmon-a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6.scope: Deactivated successfully.
Dec  1 04:16:54 np0005540741 podman[105144]: 2025-12-01 09:16:54.293870026 +0000 UTC m=+0.038290090 container create 87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_heyrovsky, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:16:54 np0005540741 systemd[1]: Started libpod-conmon-87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9.scope.
Dec  1 04:16:54 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:16:54 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e34c327657f84cb9190bedb62a4a35481642ffc67503f0522f1c1bde4861971e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:16:54 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e34c327657f84cb9190bedb62a4a35481642ffc67503f0522f1c1bde4861971e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:16:54 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e34c327657f84cb9190bedb62a4a35481642ffc67503f0522f1c1bde4861971e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:16:54 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e34c327657f84cb9190bedb62a4a35481642ffc67503f0522f1c1bde4861971e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:16:54 np0005540741 podman[105144]: 2025-12-01 09:16:54.279655571 +0000 UTC m=+0.024075655 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:16:54 np0005540741 podman[105144]: 2025-12-01 09:16:54.378168523 +0000 UTC m=+0.122588597 container init 87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_heyrovsky, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec  1 04:16:54 np0005540741 podman[105144]: 2025-12-01 09:16:54.387063816 +0000 UTC m=+0.131483880 container start 87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_heyrovsky, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec  1 04:16:54 np0005540741 podman[105144]: 2025-12-01 09:16:54.390695519 +0000 UTC m=+0.135115603 container attach 87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_heyrovsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Dec  1 04:16:54 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Dec  1 04:16:54 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Dec  1 04:16:54 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Dec  1 04:16:54 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Dec  1 04:16:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v149: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]: {
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:        "osd_id": 0,
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:        "type": "bluestore"
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:    },
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:        "osd_id": 1,
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:        "type": "bluestore"
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:    },
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:        "osd_id": 2,
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:        "type": "bluestore"
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]:    }
Dec  1 04:16:55 np0005540741 eloquent_heyrovsky[105160]: }
Dec  1 04:16:55 np0005540741 systemd[1]: libpod-87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9.scope: Deactivated successfully.
Dec  1 04:16:55 np0005540741 podman[105144]: 2025-12-01 09:16:55.378266844 +0000 UTC m=+1.122686918 container died 87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_heyrovsky, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec  1 04:16:55 np0005540741 systemd[1]: var-lib-containers-storage-overlay-e34c327657f84cb9190bedb62a4a35481642ffc67503f0522f1c1bde4861971e-merged.mount: Deactivated successfully.
Dec  1 04:16:55 np0005540741 podman[105144]: 2025-12-01 09:16:55.432721433 +0000 UTC m=+1.177141497 container remove 87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_heyrovsky, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:16:55 np0005540741 systemd[1]: libpod-conmon-87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9.scope: Deactivated successfully.
Dec  1 04:16:55 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:16:55 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:16:55 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:16:55 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:16:55 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 1c2fefe1-da38-4e99-98df-facfe1210edb does not exist
Dec  1 04:16:55 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.11 scrub starts
Dec  1 04:16:55 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.11 scrub ok
Dec  1 04:16:56 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.d scrub starts
Dec  1 04:16:56 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.d scrub ok
Dec  1 04:16:56 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:16:56 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:16:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v150: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:57 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.c scrub starts
Dec  1 04:16:57 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.c scrub ok
Dec  1 04:16:57 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1d scrub starts
Dec  1 04:16:57 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1d scrub ok
Dec  1 04:16:58 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:16:59 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Dec  1 04:16:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v151: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:16:59 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.6 scrub ok
Dec  1 04:16:59 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1f scrub starts
Dec  1 04:16:59 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1f scrub ok
Dec  1 04:16:59 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Dec  1 04:16:59 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Dec  1 04:17:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v152: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:01 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.c deep-scrub starts
Dec  1 04:17:01 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.c deep-scrub ok
Dec  1 04:17:01 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.13 scrub starts
Dec  1 04:17:01 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.13 scrub ok
Dec  1 04:17:02 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.d scrub starts
Dec  1 04:17:02 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.d scrub ok
Dec  1 04:17:02 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Dec  1 04:17:02 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Dec  1 04:17:02 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.15 scrub starts
Dec  1 04:17:02 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.15 scrub ok
Dec  1 04:17:03 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1 deep-scrub starts
Dec  1 04:17:03 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1 deep-scrub ok
Dec  1 04:17:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v153: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:17:03 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.1f deep-scrub starts
Dec  1 04:17:03 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.1f deep-scrub ok
Dec  1 04:17:04 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.f scrub starts
Dec  1 04:17:04 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.f scrub ok
Dec  1 04:17:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.5 deep-scrub starts
Dec  1 04:17:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.5 deep-scrub ok
Dec  1 04:17:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v154: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:06 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.7 scrub starts
Dec  1 04:17:06 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.7 scrub ok
Dec  1 04:17:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v155: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:07 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.17 scrub starts
Dec  1 04:17:07 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.17 scrub ok
Dec  1 04:17:07 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Dec  1 04:17:07 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Dec  1 04:17:08 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.13 scrub starts
Dec  1 04:17:08 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.13 scrub ok
Dec  1 04:17:08 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:17:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v156: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v157: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:12 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Dec  1 04:17:12 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Dec  1 04:17:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:17:12
Dec  1 04:17:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:17:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:17:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['vms', 'cephfs.cephfs.data', 'backups', 'cephfs.cephfs.meta', 'volumes', 'images', '.mgr']
Dec  1 04:17:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:17:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:17:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:17:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:17:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:17:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:17:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:17:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:17:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:17:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:17:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:17:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:17:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:17:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:17:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:17:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:17:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:17:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v158: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:17:14 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.3 deep-scrub starts
Dec  1 04:17:14 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.3 deep-scrub ok
Dec  1 04:17:14 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.18 scrub starts
Dec  1 04:17:14 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.18 scrub ok
Dec  1 04:17:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v159: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:15 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Dec  1 04:17:15 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Dec  1 04:17:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v160: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:18 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Dec  1 04:17:18 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Dec  1 04:17:18 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.12 scrub starts
Dec  1 04:17:18 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.12 scrub ok
Dec  1 04:17:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:17:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:17:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:17:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:17:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:17:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:17:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:17:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:17:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:17:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:17:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:17:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:17:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:17:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:17:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:17:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:17:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v161: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:19 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.f scrub starts
Dec  1 04:17:19 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.f scrub ok
Dec  1 04:17:20 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Dec  1 04:17:20 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Dec  1 04:17:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v162: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v163: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:23 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.9 scrub starts
Dec  1 04:17:23 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.9 scrub ok
Dec  1 04:17:23 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.9 scrub starts
Dec  1 04:17:23 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.9 scrub ok
Dec  1 04:17:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:17:24 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.c deep-scrub starts
Dec  1 04:17:24 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.c deep-scrub ok
Dec  1 04:17:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v164: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:26 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.16 scrub starts
Dec  1 04:17:26 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.16 scrub ok
Dec  1 04:17:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v165: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:27 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.f scrub starts
Dec  1 04:17:27 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.f scrub ok
Dec  1 04:17:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:17:28 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Dec  1 04:17:28 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Dec  1 04:17:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v166: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:29 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Dec  1 04:17:29 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Dec  1 04:17:30 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Dec  1 04:17:30 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Dec  1 04:17:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v167: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.12 deep-scrub starts
Dec  1 04:17:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.12 deep-scrub ok
Dec  1 04:17:32 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.6 deep-scrub starts
Dec  1 04:17:32 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.6 deep-scrub ok
Dec  1 04:17:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v168: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:33 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.3 scrub starts
Dec  1 04:17:33 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.3 scrub ok
Dec  1 04:17:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:17:33 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Dec  1 04:17:33 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Dec  1 04:17:34 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Dec  1 04:17:34 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Dec  1 04:17:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v169: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:35 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.a scrub starts
Dec  1 04:17:35 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.a scrub ok
Dec  1 04:17:36 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Dec  1 04:17:36 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Dec  1 04:17:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v170: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:37 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.4 scrub starts
Dec  1 04:17:37 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.4 scrub ok
Dec  1 04:17:38 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Dec  1 04:17:38 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Dec  1 04:17:38 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:17:38 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.e scrub starts
Dec  1 04:17:38 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.e scrub ok
Dec  1 04:17:38 np0005540741 python3.9[105483]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:17:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v171: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:40 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Dec  1 04:17:40 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Dec  1 04:17:40 np0005540741 python3.9[105770]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec  1 04:17:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v172: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:41 np0005540741 python3.9[105922]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec  1 04:17:41 np0005540741 python3.9[106074]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:17:42 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.a scrub starts
Dec  1 04:17:42 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.a scrub ok
Dec  1 04:17:42 np0005540741 python3.9[106226]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec  1 04:17:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:17:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:17:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:17:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:17:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:17:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:17:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v173: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:43 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:17:44 np0005540741 python3.9[106378]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:17:44 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Dec  1 04:17:44 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Dec  1 04:17:44 np0005540741 python3.9[106530]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:17:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v174: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:45 np0005540741 python3.9[106608]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:17:45 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.18 scrub starts
Dec  1 04:17:45 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.18 scrub ok
Dec  1 04:17:46 np0005540741 python3.9[106760]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:17:46 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.a scrub starts
Dec  1 04:17:46 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.a scrub ok
Dec  1 04:17:46 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Dec  1 04:17:46 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Dec  1 04:17:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v175: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:47 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.1f scrub starts
Dec  1 04:17:47 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.1f scrub ok
Dec  1 04:17:47 np0005540741 python3.9[106914]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec  1 04:17:48 np0005540741 python3.9[107067]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec  1 04:17:48 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.d scrub starts
Dec  1 04:17:48 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.d scrub ok
Dec  1 04:17:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:17:48 np0005540741 python3.9[107220]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 04:17:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v176: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:49 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.1b scrub starts
Dec  1 04:17:49 np0005540741 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.1b scrub ok
Dec  1 04:17:49 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Dec  1 04:17:49 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Dec  1 04:17:49 np0005540741 python3.9[107372]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec  1 04:17:50 np0005540741 python3.9[107524]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:17:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v177: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:51 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.6 scrub starts
Dec  1 04:17:51 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.6 scrub ok
Dec  1 04:17:52 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Dec  1 04:17:52 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Dec  1 04:17:52 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Dec  1 04:17:52 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Dec  1 04:17:52 np0005540741 python3.9[107677]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:17:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v178: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:53 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Dec  1 04:17:53 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Dec  1 04:17:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:17:53 np0005540741 python3.9[107829]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:17:54 np0005540741 python3.9[107907]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:17:54 np0005540741 python3.9[108059]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:17:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v179: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:55 np0005540741 python3.9[108137]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:17:56 np0005540741 python3.9[108363]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:17:56 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Dec  1 04:17:56 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Dec  1 04:17:56 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:17:56 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:17:56 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:17:56 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:17:56 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:17:56 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:17:56 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 508a59e6-abeb-4f7e-84f6-648341ece6d2 does not exist
Dec  1 04:17:56 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 601325cf-341b-4646-8184-e7d4acb0a43a does not exist
Dec  1 04:17:56 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 8722566b-2e50-492c-b4f2-e2f4f2e5c366 does not exist
Dec  1 04:17:56 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:17:56 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:17:56 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:17:56 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:17:56 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:17:56 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:17:56 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:17:56 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:17:56 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:17:56 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.2 deep-scrub starts
Dec  1 04:17:56 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.2 deep-scrub ok
Dec  1 04:17:56 np0005540741 podman[108560]: 2025-12-01 09:17:56.99070135 +0000 UTC m=+0.047149136 container create 0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_jennings, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2)
Dec  1 04:17:57 np0005540741 systemd[1]: Started libpod-conmon-0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609.scope.
Dec  1 04:17:57 np0005540741 podman[108560]: 2025-12-01 09:17:56.969155435 +0000 UTC m=+0.025603221 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:17:57 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:17:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v180: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:57 np0005540741 podman[108560]: 2025-12-01 09:17:57.102937712 +0000 UTC m=+0.159385538 container init 0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_jennings, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:17:57 np0005540741 podman[108560]: 2025-12-01 09:17:57.114221654 +0000 UTC m=+0.170669440 container start 0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_jennings, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  1 04:17:57 np0005540741 podman[108560]: 2025-12-01 09:17:57.119265758 +0000 UTC m=+0.175713524 container attach 0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_jennings, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:17:57 np0005540741 nice_jennings[108576]: 167 167
Dec  1 04:17:57 np0005540741 systemd[1]: libpod-0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609.scope: Deactivated successfully.
Dec  1 04:17:57 np0005540741 podman[108560]: 2025-12-01 09:17:57.123329384 +0000 UTC m=+0.179777160 container died 0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_jennings, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:17:57 np0005540741 systemd[1]: var-lib-containers-storage-overlay-2af10e1e43a76bdf5d8e118a39cc600d273d25af88865c277b99ecfd2945f6c3-merged.mount: Deactivated successfully.
Dec  1 04:17:57 np0005540741 podman[108560]: 2025-12-01 09:17:57.160839644 +0000 UTC m=+0.217287410 container remove 0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_jennings, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:17:57 np0005540741 systemd[1]: libpod-conmon-0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609.scope: Deactivated successfully.
Dec  1 04:17:57 np0005540741 podman[108601]: 2025-12-01 09:17:57.338871163 +0000 UTC m=+0.046195049 container create 56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_yalow, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True)
Dec  1 04:17:57 np0005540741 systemd[1]: Started libpod-conmon-56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439.scope.
Dec  1 04:17:57 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:17:57 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb5b3d103514b8c4fb6619ee9c66c4f491a680a111199139f2229db15586aeb0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:17:57 np0005540741 podman[108601]: 2025-12-01 09:17:57.318515303 +0000 UTC m=+0.025839219 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:17:57 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb5b3d103514b8c4fb6619ee9c66c4f491a680a111199139f2229db15586aeb0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:17:57 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb5b3d103514b8c4fb6619ee9c66c4f491a680a111199139f2229db15586aeb0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:17:57 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb5b3d103514b8c4fb6619ee9c66c4f491a680a111199139f2229db15586aeb0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:17:57 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb5b3d103514b8c4fb6619ee9c66c4f491a680a111199139f2229db15586aeb0/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:17:57 np0005540741 podman[108601]: 2025-12-01 09:17:57.444013473 +0000 UTC m=+0.151337389 container init 56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_yalow, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:17:57 np0005540741 podman[108601]: 2025-12-01 09:17:57.452072913 +0000 UTC m=+0.159396799 container start 56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_yalow, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:17:57 np0005540741 podman[108601]: 2025-12-01 09:17:57.457316093 +0000 UTC m=+0.164640009 container attach 56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_yalow, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Dec  1 04:17:58 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.b scrub starts
Dec  1 04:17:58 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.b scrub ok
Dec  1 04:17:58 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Dec  1 04:17:58 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:17:58 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Dec  1 04:17:58 np0005540741 eager_yalow[108618]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:17:58 np0005540741 eager_yalow[108618]: --> relative data size: 1.0
Dec  1 04:17:58 np0005540741 eager_yalow[108618]: --> All data devices are unavailable
Dec  1 04:17:58 np0005540741 python3.9[108786]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:17:58 np0005540741 systemd[1]: libpod-56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439.scope: Deactivated successfully.
Dec  1 04:17:58 np0005540741 systemd[1]: libpod-56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439.scope: Consumed 1.037s CPU time.
Dec  1 04:17:58 np0005540741 podman[108601]: 2025-12-01 09:17:58.561190705 +0000 UTC m=+1.268514591 container died 56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_yalow, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3)
Dec  1 04:17:58 np0005540741 systemd[1]: var-lib-containers-storage-overlay-eb5b3d103514b8c4fb6619ee9c66c4f491a680a111199139f2229db15586aeb0-merged.mount: Deactivated successfully.
Dec  1 04:17:58 np0005540741 podman[108601]: 2025-12-01 09:17:58.617127291 +0000 UTC m=+1.324451167 container remove 56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_yalow, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:17:58 np0005540741 systemd[1]: libpod-conmon-56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439.scope: Deactivated successfully.
Dec  1 04:17:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v181: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:17:59 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.5 scrub starts
Dec  1 04:17:59 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.5 scrub ok
Dec  1 04:17:59 np0005540741 podman[109097]: 2025-12-01 09:17:59.284649954 +0000 UTC m=+0.046818486 container create ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_einstein, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:17:59 np0005540741 systemd[1]: Started libpod-conmon-ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c.scope.
Dec  1 04:17:59 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:17:59 np0005540741 podman[109097]: 2025-12-01 09:17:59.261084922 +0000 UTC m=+0.023253474 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:17:59 np0005540741 python3.9[109091]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec  1 04:17:59 np0005540741 podman[109097]: 2025-12-01 09:17:59.367429806 +0000 UTC m=+0.129598368 container init ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_einstein, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:17:59 np0005540741 podman[109097]: 2025-12-01 09:17:59.379973894 +0000 UTC m=+0.142142436 container start ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_einstein, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507)
Dec  1 04:17:59 np0005540741 infallible_einstein[109114]: 167 167
Dec  1 04:17:59 np0005540741 podman[109097]: 2025-12-01 09:17:59.385918204 +0000 UTC m=+0.148086776 container attach ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_einstein, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec  1 04:17:59 np0005540741 systemd[1]: libpod-ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c.scope: Deactivated successfully.
Dec  1 04:17:59 np0005540741 podman[109097]: 2025-12-01 09:17:59.387595281 +0000 UTC m=+0.149763883 container died ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_einstein, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:17:59 np0005540741 systemd[1]: var-lib-containers-storage-overlay-4d61e15a1c8425063b55a03892a92ef5a382207c04fdd320093c10c278c8b561-merged.mount: Deactivated successfully.
Dec  1 04:17:59 np0005540741 podman[109097]: 2025-12-01 09:17:59.446527863 +0000 UTC m=+0.208696435 container remove ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_einstein, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Dec  1 04:17:59 np0005540741 systemd[1]: libpod-conmon-ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c.scope: Deactivated successfully.
Dec  1 04:17:59 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.c deep-scrub starts
Dec  1 04:17:59 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.c deep-scrub ok
Dec  1 04:17:59 np0005540741 podman[109184]: 2025-12-01 09:17:59.649969667 +0000 UTC m=+0.048403602 container create 3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_sinoussi, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec  1 04:17:59 np0005540741 systemd[1]: Started libpod-conmon-3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2.scope.
Dec  1 04:17:59 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:17:59 np0005540741 podman[109184]: 2025-12-01 09:17:59.629860743 +0000 UTC m=+0.028294698 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:17:59 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef126a3aab5a2f87974a167fa5f64ac43bb71f0c9fbef155d675d34b54d65d84/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:17:59 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef126a3aab5a2f87974a167fa5f64ac43bb71f0c9fbef155d675d34b54d65d84/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:17:59 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef126a3aab5a2f87974a167fa5f64ac43bb71f0c9fbef155d675d34b54d65d84/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:17:59 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef126a3aab5a2f87974a167fa5f64ac43bb71f0c9fbef155d675d34b54d65d84/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:17:59 np0005540741 podman[109184]: 2025-12-01 09:17:59.744527294 +0000 UTC m=+0.142961259 container init 3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_sinoussi, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Dec  1 04:17:59 np0005540741 podman[109184]: 2025-12-01 09:17:59.751609727 +0000 UTC m=+0.150043672 container start 3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_sinoussi, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:17:59 np0005540741 podman[109184]: 2025-12-01 09:17:59.755676332 +0000 UTC m=+0.154110287 container attach 3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_sinoussi, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:18:00 np0005540741 python3.9[109309]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:18:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.7 scrub starts
Dec  1 04:18:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.7 scrub ok
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]: {
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:    "0": [
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:        {
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "devices": [
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "/dev/loop3"
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            ],
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "lv_name": "ceph_lv0",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "lv_size": "21470642176",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "name": "ceph_lv0",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "tags": {
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.cluster_name": "ceph",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.crush_device_class": "",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.encrypted": "0",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.osd_id": "0",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.type": "block",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.vdo": "0"
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            },
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "type": "block",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "vg_name": "ceph_vg0"
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:        }
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:    ],
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:    "1": [
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:        {
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "devices": [
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "/dev/loop4"
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            ],
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "lv_name": "ceph_lv1",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "lv_size": "21470642176",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "name": "ceph_lv1",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "tags": {
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.cluster_name": "ceph",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.crush_device_class": "",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.encrypted": "0",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.osd_id": "1",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.type": "block",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.vdo": "0"
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            },
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "type": "block",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "vg_name": "ceph_vg1"
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:        }
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:    ],
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:    "2": [
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:        {
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "devices": [
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "/dev/loop5"
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            ],
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "lv_name": "ceph_lv2",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "lv_size": "21470642176",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "name": "ceph_lv2",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "tags": {
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.cluster_name": "ceph",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.crush_device_class": "",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.encrypted": "0",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.osd_id": "2",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.type": "block",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:                "ceph.vdo": "0"
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            },
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "type": "block",
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:            "vg_name": "ceph_vg2"
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:        }
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]:    ]
Dec  1 04:18:00 np0005540741 flamboyant_sinoussi[109252]: }
Dec  1 04:18:00 np0005540741 systemd[1]: libpod-3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2.scope: Deactivated successfully.
Dec  1 04:18:00 np0005540741 podman[109184]: 2025-12-01 09:18:00.541997146 +0000 UTC m=+0.940431081 container died 3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_sinoussi, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  1 04:18:00 np0005540741 systemd[1]: var-lib-containers-storage-overlay-ef126a3aab5a2f87974a167fa5f64ac43bb71f0c9fbef155d675d34b54d65d84-merged.mount: Deactivated successfully.
Dec  1 04:18:00 np0005540741 podman[109184]: 2025-12-01 09:18:00.601108883 +0000 UTC m=+0.999542818 container remove 3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_sinoussi, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True)
Dec  1 04:18:00 np0005540741 systemd[1]: libpod-conmon-3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2.scope: Deactivated successfully.
Dec  1 04:18:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v182: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:01 np0005540741 podman[109564]: 2025-12-01 09:18:01.355346081 +0000 UTC m=+0.095622069 container create 2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_euler, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:18:01 np0005540741 podman[109564]: 2025-12-01 09:18:01.289087761 +0000 UTC m=+0.029363839 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:18:01 np0005540741 systemd[1]: Started libpod-conmon-2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893.scope.
Dec  1 04:18:01 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:18:01 np0005540741 podman[109564]: 2025-12-01 09:18:01.444571017 +0000 UTC m=+0.184847055 container init 2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_euler, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:18:01 np0005540741 podman[109564]: 2025-12-01 09:18:01.452858493 +0000 UTC m=+0.193134491 container start 2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_euler, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:18:01 np0005540741 hardcore_euler[109633]: 167 167
Dec  1 04:18:01 np0005540741 systemd[1]: libpod-2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893.scope: Deactivated successfully.
Dec  1 04:18:01 np0005540741 conmon[109633]: conmon 2f555bb1fa017585d7a8 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893.scope/container/memory.events
Dec  1 04:18:01 np0005540741 podman[109564]: 2025-12-01 09:18:01.460743358 +0000 UTC m=+0.201019346 container attach 2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_euler, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec  1 04:18:01 np0005540741 podman[109564]: 2025-12-01 09:18:01.461641274 +0000 UTC m=+0.201917262 container died 2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_euler, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec  1 04:18:01 np0005540741 systemd[1]: var-lib-containers-storage-overlay-c81106908f25a4f9a1dddd7354a9b5a02d63b40b985a538cb95eb5867727dddb-merged.mount: Deactivated successfully.
Dec  1 04:18:01 np0005540741 podman[109564]: 2025-12-01 09:18:01.532600809 +0000 UTC m=+0.272876807 container remove 2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_euler, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec  1 04:18:01 np0005540741 systemd[1]: libpod-conmon-2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893.scope: Deactivated successfully.
Dec  1 04:18:01 np0005540741 python3.9[109635]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:18:01 np0005540741 podman[109660]: 2025-12-01 09:18:01.726752438 +0000 UTC m=+0.051876401 container create d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_keller, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  1 04:18:01 np0005540741 systemd[1]: Started libpod-conmon-d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684.scope.
Dec  1 04:18:01 np0005540741 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec  1 04:18:01 np0005540741 podman[109660]: 2025-12-01 09:18:01.704378919 +0000 UTC m=+0.029502892 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:18:01 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:18:01 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8d3b2734e6ac35c6d18e38c1d6c6f82c3055efaed8f2e82295d1926b562028a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:18:01 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8d3b2734e6ac35c6d18e38c1d6c6f82c3055efaed8f2e82295d1926b562028a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:18:01 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8d3b2734e6ac35c6d18e38c1d6c6f82c3055efaed8f2e82295d1926b562028a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:18:01 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8d3b2734e6ac35c6d18e38c1d6c6f82c3055efaed8f2e82295d1926b562028a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:18:01 np0005540741 podman[109660]: 2025-12-01 09:18:01.823441546 +0000 UTC m=+0.148565529 container init d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_keller, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:18:01 np0005540741 podman[109660]: 2025-12-01 09:18:01.838871786 +0000 UTC m=+0.163995759 container start d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_keller, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:18:01 np0005540741 podman[109660]: 2025-12-01 09:18:01.843584271 +0000 UTC m=+0.168708274 container attach d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_keller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:18:01 np0005540741 systemd[1]: tuned.service: Deactivated successfully.
Dec  1 04:18:01 np0005540741 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec  1 04:18:01 np0005540741 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  1 04:18:02 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.e scrub starts
Dec  1 04:18:02 np0005540741 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  1 04:18:02 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.e scrub ok
Dec  1 04:18:02 np0005540741 practical_keller[109681]: {
Dec  1 04:18:02 np0005540741 practical_keller[109681]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:18:02 np0005540741 practical_keller[109681]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:18:02 np0005540741 practical_keller[109681]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:18:02 np0005540741 practical_keller[109681]:        "osd_id": 0,
Dec  1 04:18:02 np0005540741 practical_keller[109681]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:18:02 np0005540741 practical_keller[109681]:        "type": "bluestore"
Dec  1 04:18:02 np0005540741 practical_keller[109681]:    },
Dec  1 04:18:02 np0005540741 practical_keller[109681]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:18:02 np0005540741 practical_keller[109681]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:18:02 np0005540741 practical_keller[109681]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:18:02 np0005540741 practical_keller[109681]:        "osd_id": 1,
Dec  1 04:18:02 np0005540741 practical_keller[109681]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:18:02 np0005540741 practical_keller[109681]:        "type": "bluestore"
Dec  1 04:18:02 np0005540741 practical_keller[109681]:    },
Dec  1 04:18:02 np0005540741 practical_keller[109681]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:18:02 np0005540741 practical_keller[109681]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:18:02 np0005540741 practical_keller[109681]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:18:02 np0005540741 practical_keller[109681]:        "osd_id": 2,
Dec  1 04:18:02 np0005540741 practical_keller[109681]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:18:02 np0005540741 practical_keller[109681]:        "type": "bluestore"
Dec  1 04:18:02 np0005540741 practical_keller[109681]:    }
Dec  1 04:18:02 np0005540741 practical_keller[109681]: }
Dec  1 04:18:02 np0005540741 systemd[1]: libpod-d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684.scope: Deactivated successfully.
Dec  1 04:18:02 np0005540741 systemd[1]: libpod-d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684.scope: Consumed 1.095s CPU time.
Dec  1 04:18:02 np0005540741 python3.9[109860]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec  1 04:18:02 np0005540741 podman[109872]: 2025-12-01 09:18:02.980358192 +0000 UTC m=+0.029968026 container died d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_keller, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:18:03 np0005540741 systemd[1]: var-lib-containers-storage-overlay-b8d3b2734e6ac35c6d18e38c1d6c6f82c3055efaed8f2e82295d1926b562028a-merged.mount: Deactivated successfully.
Dec  1 04:18:03 np0005540741 podman[109872]: 2025-12-01 09:18:03.045686366 +0000 UTC m=+0.095296170 container remove d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_keller, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:18:03 np0005540741 systemd[1]: libpod-conmon-d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684.scope: Deactivated successfully.
Dec  1 04:18:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:18:03 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:18:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v183: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:18:03 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:18:03 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 01fcc80d-6709-44b6-be41-3e809551f01e does not exist
Dec  1 04:18:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:18:03 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:18:03 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:18:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v184: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Dec  1 04:18:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Dec  1 04:18:05 np0005540741 python3.9[110088]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:18:06 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.8 deep-scrub starts
Dec  1 04:18:06 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.8 deep-scrub ok
Dec  1 04:18:06 np0005540741 python3.9[110242]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:18:06 np0005540741 systemd[1]: session-35.scope: Deactivated successfully.
Dec  1 04:18:06 np0005540741 systemd[1]: session-35.scope: Consumed 1min 12.517s CPU time.
Dec  1 04:18:06 np0005540741 systemd-logind[788]: Session 35 logged out. Waiting for processes to exit.
Dec  1 04:18:06 np0005540741 systemd-logind[788]: Removed session 35.
Dec  1 04:18:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v185: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:07 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.17 scrub starts
Dec  1 04:18:07 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.17 scrub ok
Dec  1 04:18:08 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Dec  1 04:18:08 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Dec  1 04:18:08 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1a scrub starts
Dec  1 04:18:08 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1a scrub ok
Dec  1 04:18:08 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:18:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v186: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v187: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:11 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Dec  1 04:18:11 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Dec  1 04:18:12 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Dec  1 04:18:12 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Dec  1 04:18:12 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Dec  1 04:18:12 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Dec  1 04:18:12 np0005540741 systemd-logind[788]: New session 36 of user zuul.
Dec  1 04:18:12 np0005540741 systemd[1]: Started Session 36 of User zuul.
Dec  1 04:18:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:18:12
Dec  1 04:18:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:18:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:18:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['images', 'cephfs.cephfs.meta', 'vms', '.mgr', 'volumes', 'cephfs.cephfs.data', 'backups']
Dec  1 04:18:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:18:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:18:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:18:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:18:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:18:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:18:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:18:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:18:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:18:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:18:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:18:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:18:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:18:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:18:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:18:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:18:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:18:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v188: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:13 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Dec  1 04:18:13 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Dec  1 04:18:13 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Dec  1 04:18:13 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Dec  1 04:18:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:18:13 np0005540741 python3.9[110422]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:18:14 np0005540741 python3.9[110578]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec  1 04:18:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v189: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:15 np0005540741 python3.9[110731]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:18:16 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Dec  1 04:18:16 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Dec  1 04:18:16 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Dec  1 04:18:16 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Dec  1 04:18:16 np0005540741 python3.9[110815]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  1 04:18:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v190: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:17 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1e scrub starts
Dec  1 04:18:17 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1e scrub ok
Dec  1 04:18:18 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.1b deep-scrub starts
Dec  1 04:18:18 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.1b deep-scrub ok
Dec  1 04:18:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:18:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:18:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:18:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:18:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:18:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:18:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:18:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:18:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:18:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:18:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:18:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:18:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:18:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:18:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:18:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:18:18 np0005540741 python3.9[110968]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:18:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v191: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v192: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:21 np0005540741 python3.9[111121]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 04:18:22 np0005540741 python3.9[111274]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:18:22 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.e scrub starts
Dec  1 04:18:22 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.e scrub ok
Dec  1 04:18:22 np0005540741 python3.9[111426]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec  1 04:18:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v193: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:18:23 np0005540741 python3.9[111576]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:18:24 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1d scrub starts
Dec  1 04:18:24 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1d scrub ok
Dec  1 04:18:24 np0005540741 python3.9[111734]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:18:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v194: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:26 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1c scrub starts
Dec  1 04:18:26 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1c scrub ok
Dec  1 04:18:26 np0005540741 python3.9[111887]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:18:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v195: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:27 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Dec  1 04:18:27 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Dec  1 04:18:28 np0005540741 python3.9[112174]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  1 04:18:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:18:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v196: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:29 np0005540741 python3.9[112324]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:18:29 np0005540741 python3.9[112478]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:18:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v197: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:32 np0005540741 python3.9[112631]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:18:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v198: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:18:34 np0005540741 python3.9[112784]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:18:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v199: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:35 np0005540741 python3.9[112938]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Dec  1 04:18:36 np0005540741 systemd[1]: session-36.scope: Deactivated successfully.
Dec  1 04:18:36 np0005540741 systemd[1]: session-36.scope: Consumed 17.930s CPU time.
Dec  1 04:18:36 np0005540741 systemd-logind[788]: Session 36 logged out. Waiting for processes to exit.
Dec  1 04:18:36 np0005540741 systemd-logind[788]: Removed session 36.
Dec  1 04:18:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v200: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:38 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:18:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v201: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v202: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:41 np0005540741 systemd-logind[788]: New session 37 of user zuul.
Dec  1 04:18:41 np0005540741 systemd[1]: Started Session 37 of User zuul.
Dec  1 04:18:42 np0005540741 python3.9[113116]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:18:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:18:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:18:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:18:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:18:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:18:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:18:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v203: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:43 np0005540741 python3.9[113270]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:18:43 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:18:44 np0005540741 python3.9[113463]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:18:45 np0005540741 systemd[1]: session-37.scope: Deactivated successfully.
Dec  1 04:18:45 np0005540741 systemd[1]: session-37.scope: Consumed 2.533s CPU time.
Dec  1 04:18:45 np0005540741 systemd-logind[788]: Session 37 logged out. Waiting for processes to exit.
Dec  1 04:18:45 np0005540741 systemd-logind[788]: Removed session 37.
Dec  1 04:18:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v204: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v205: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:18:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v206: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:50 np0005540741 systemd-logind[788]: New session 38 of user zuul.
Dec  1 04:18:50 np0005540741 systemd[1]: Started Session 38 of User zuul.
Dec  1 04:18:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v207: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:51 np0005540741 python3.9[113643]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:18:52 np0005540741 python3.9[113797]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:18:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v208: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:18:53 np0005540741 python3.9[113953]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:18:54 np0005540741 python3.9[114037]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:18:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v209: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:56 np0005540741 python3.9[114190]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:18:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v210: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:58 np0005540741 python3.9[114385]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:18:58 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:18:58 np0005540741 python3.9[114537]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:18:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v211: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:18:59 np0005540741 python3.9[114702]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:19:00 np0005540741 python3.9[114780]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:00 np0005540741 python3.9[114932]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:19:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v212: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:01 np0005540741 python3.9[115010]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:19:02 np0005540741 python3.9[115162]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:19:02 np0005540741 python3.9[115314]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:19:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v213: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:19:03 np0005540741 python3.9[115489]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:19:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:19:04 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:19:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:19:04 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:19:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:19:04 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:19:04 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 0fcfb794-f73b-477c-a5ce-19a94a36b49b does not exist
Dec  1 04:19:04 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 4021f264-8bed-4771-a488-a27a1e1c94a5 does not exist
Dec  1 04:19:04 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 6baee5d1-4cc6-43fe-a6f2-2dd304f92091 does not exist
Dec  1 04:19:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:19:04 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:19:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:19:04 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:19:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:19:04 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:19:04 np0005540741 python3.9[115747]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:19:04 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:19:04 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:19:04 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:19:04 np0005540741 podman[115990]: 2025-12-01 09:19:04.618244384 +0000 UTC m=+0.043629910 container create e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_franklin, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec  1 04:19:04 np0005540741 systemd[76658]: Created slice User Background Tasks Slice.
Dec  1 04:19:04 np0005540741 systemd[76658]: Starting Cleanup of User's Temporary Files and Directories...
Dec  1 04:19:04 np0005540741 systemd[1]: Started libpod-conmon-e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84.scope.
Dec  1 04:19:04 np0005540741 systemd[76658]: Finished Cleanup of User's Temporary Files and Directories.
Dec  1 04:19:04 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:19:04 np0005540741 podman[115990]: 2025-12-01 09:19:04.599886626 +0000 UTC m=+0.025272162 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:19:04 np0005540741 podman[115990]: 2025-12-01 09:19:04.706882211 +0000 UTC m=+0.132267757 container init e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_franklin, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:19:04 np0005540741 podman[115990]: 2025-12-01 09:19:04.717402799 +0000 UTC m=+0.142788315 container start e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_franklin, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec  1 04:19:04 np0005540741 podman[115990]: 2025-12-01 09:19:04.721510549 +0000 UTC m=+0.146896095 container attach e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_franklin, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec  1 04:19:04 np0005540741 systemd[1]: libpod-e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84.scope: Deactivated successfully.
Dec  1 04:19:04 np0005540741 nifty_franklin[116053]: 167 167
Dec  1 04:19:04 np0005540741 conmon[116053]: conmon e2d561b52a01002b55b5 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84.scope/container/memory.events
Dec  1 04:19:04 np0005540741 podman[115990]: 2025-12-01 09:19:04.72803239 +0000 UTC m=+0.153417906 container died e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_franklin, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec  1 04:19:04 np0005540741 systemd[1]: var-lib-containers-storage-overlay-df462a973db5f2102e7c8467d4330e423558579b482123e47f23774a42bc3cc2-merged.mount: Deactivated successfully.
Dec  1 04:19:04 np0005540741 podman[115990]: 2025-12-01 09:19:04.769633789 +0000 UTC m=+0.195019315 container remove e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_franklin, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:19:04 np0005540741 systemd[1]: libpod-conmon-e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84.scope: Deactivated successfully.
Dec  1 04:19:04 np0005540741 python3.9[116063]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:19:04 np0005540741 podman[116085]: 2025-12-01 09:19:04.950108837 +0000 UTC m=+0.054129637 container create 3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_wescoff, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:19:04 np0005540741 systemd[1]: Started libpod-conmon-3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1.scope.
Dec  1 04:19:05 np0005540741 podman[116085]: 2025-12-01 09:19:04.923826717 +0000 UTC m=+0.027847607 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:19:05 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:19:05 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/353aefb266033841909c2dea13bb09560e1ccfdda1c6fcbe9316a056e8743818/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:19:05 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/353aefb266033841909c2dea13bb09560e1ccfdda1c6fcbe9316a056e8743818/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:19:05 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/353aefb266033841909c2dea13bb09560e1ccfdda1c6fcbe9316a056e8743818/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:19:05 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/353aefb266033841909c2dea13bb09560e1ccfdda1c6fcbe9316a056e8743818/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:19:05 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/353aefb266033841909c2dea13bb09560e1ccfdda1c6fcbe9316a056e8743818/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:19:05 np0005540741 podman[116085]: 2025-12-01 09:19:05.038714193 +0000 UTC m=+0.142735013 container init 3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_wescoff, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  1 04:19:05 np0005540741 podman[116085]: 2025-12-01 09:19:05.048980484 +0000 UTC m=+0.153001284 container start 3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_wescoff, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec  1 04:19:05 np0005540741 podman[116085]: 2025-12-01 09:19:05.05259094 +0000 UTC m=+0.156611780 container attach 3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_wescoff, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:19:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v214: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:06 np0005540741 hungry_wescoff[116103]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:19:06 np0005540741 hungry_wescoff[116103]: --> relative data size: 1.0
Dec  1 04:19:06 np0005540741 hungry_wescoff[116103]: --> All data devices are unavailable
Dec  1 04:19:06 np0005540741 systemd[1]: libpod-3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1.scope: Deactivated successfully.
Dec  1 04:19:06 np0005540741 systemd[1]: libpod-3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1.scope: Consumed 1.113s CPU time.
Dec  1 04:19:06 np0005540741 podman[116132]: 2025-12-01 09:19:06.258852163 +0000 UTC m=+0.033035809 container died 3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_wescoff, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:19:06 np0005540741 systemd[1]: var-lib-containers-storage-overlay-353aefb266033841909c2dea13bb09560e1ccfdda1c6fcbe9316a056e8743818-merged.mount: Deactivated successfully.
Dec  1 04:19:06 np0005540741 podman[116132]: 2025-12-01 09:19:06.329704669 +0000 UTC m=+0.103888315 container remove 3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_wescoff, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:19:06 np0005540741 systemd[1]: libpod-conmon-3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1.scope: Deactivated successfully.
Dec  1 04:19:06 np0005540741 podman[116334]: 2025-12-01 09:19:06.953249979 +0000 UTC m=+0.045133813 container create 5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_khorana, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:19:06 np0005540741 systemd[1]: Started libpod-conmon-5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401.scope.
Dec  1 04:19:07 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:19:07 np0005540741 podman[116334]: 2025-12-01 09:19:06.935487769 +0000 UTC m=+0.027371643 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:19:07 np0005540741 podman[116334]: 2025-12-01 09:19:07.045364618 +0000 UTC m=+0.137248492 container init 5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_khorana, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:19:07 np0005540741 podman[116334]: 2025-12-01 09:19:07.053105475 +0000 UTC m=+0.144989309 container start 5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_khorana, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:19:07 np0005540741 podman[116334]: 2025-12-01 09:19:07.057417821 +0000 UTC m=+0.149301665 container attach 5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_khorana, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec  1 04:19:07 np0005540741 condescending_khorana[116401]: 167 167
Dec  1 04:19:07 np0005540741 systemd[1]: libpod-5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401.scope: Deactivated successfully.
Dec  1 04:19:07 np0005540741 podman[116334]: 2025-12-01 09:19:07.062202141 +0000 UTC m=+0.154085985 container died 5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_khorana, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec  1 04:19:07 np0005540741 systemd[1]: var-lib-containers-storage-overlay-1184d164b0653522a65e11bc4631e91d7e93b70056f5e382f92fb46885ec9951-merged.mount: Deactivated successfully.
Dec  1 04:19:07 np0005540741 podman[116334]: 2025-12-01 09:19:07.102043059 +0000 UTC m=+0.193926913 container remove 5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_khorana, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:19:07 np0005540741 systemd[1]: libpod-conmon-5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401.scope: Deactivated successfully.
Dec  1 04:19:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v215: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:07 np0005540741 podman[116479]: 2025-12-01 09:19:07.287017307 +0000 UTC m=+0.051635304 container create e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_antonelli, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Dec  1 04:19:07 np0005540741 systemd[1]: Started libpod-conmon-e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039.scope.
Dec  1 04:19:07 np0005540741 podman[116479]: 2025-12-01 09:19:07.264047954 +0000 UTC m=+0.028665951 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:19:07 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:19:07 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4742dfe1e3643d4b7d4da31c965b962bb31a4e794f6c08dcc49db142316fce7/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:19:07 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4742dfe1e3643d4b7d4da31c965b962bb31a4e794f6c08dcc49db142316fce7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:19:07 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4742dfe1e3643d4b7d4da31c965b962bb31a4e794f6c08dcc49db142316fce7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:19:07 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4742dfe1e3643d4b7d4da31c965b962bb31a4e794f6c08dcc49db142316fce7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:19:07 np0005540741 podman[116479]: 2025-12-01 09:19:07.392520879 +0000 UTC m=+0.157138866 container init e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_antonelli, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS)
Dec  1 04:19:07 np0005540741 podman[116479]: 2025-12-01 09:19:07.400688338 +0000 UTC m=+0.165306305 container start e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_antonelli, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:19:07 np0005540741 podman[116479]: 2025-12-01 09:19:07.403961544 +0000 UTC m=+0.168579511 container attach e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_antonelli, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Dec  1 04:19:07 np0005540741 python3.9[116473]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:19:08 np0005540741 python3.9[116653]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]: {
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:    "0": [
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:        {
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "devices": [
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "/dev/loop3"
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            ],
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "lv_name": "ceph_lv0",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "lv_size": "21470642176",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "name": "ceph_lv0",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "tags": {
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.cluster_name": "ceph",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.crush_device_class": "",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.encrypted": "0",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.osd_id": "0",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.type": "block",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.vdo": "0"
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            },
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "type": "block",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "vg_name": "ceph_vg0"
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:        }
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:    ],
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:    "1": [
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:        {
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "devices": [
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "/dev/loop4"
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            ],
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "lv_name": "ceph_lv1",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "lv_size": "21470642176",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "name": "ceph_lv1",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "tags": {
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.cluster_name": "ceph",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.crush_device_class": "",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.encrypted": "0",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.osd_id": "1",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.type": "block",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.vdo": "0"
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            },
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "type": "block",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "vg_name": "ceph_vg1"
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:        }
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:    ],
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:    "2": [
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:        {
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "devices": [
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "/dev/loop5"
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            ],
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "lv_name": "ceph_lv2",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "lv_size": "21470642176",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "name": "ceph_lv2",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "tags": {
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.cluster_name": "ceph",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.crush_device_class": "",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.encrypted": "0",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.osd_id": "2",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.type": "block",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:                "ceph.vdo": "0"
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            },
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "type": "block",
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:            "vg_name": "ceph_vg2"
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:        }
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]:    ]
Dec  1 04:19:08 np0005540741 affectionate_antonelli[116495]: }
Dec  1 04:19:08 np0005540741 systemd[1]: libpod-e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039.scope: Deactivated successfully.
Dec  1 04:19:08 np0005540741 podman[116479]: 2025-12-01 09:19:08.260433908 +0000 UTC m=+1.025051885 container died e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_antonelli, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:19:08 np0005540741 systemd[1]: var-lib-containers-storage-overlay-b4742dfe1e3643d4b7d4da31c965b962bb31a4e794f6c08dcc49db142316fce7-merged.mount: Deactivated successfully.
Dec  1 04:19:08 np0005540741 podman[116479]: 2025-12-01 09:19:08.334599851 +0000 UTC m=+1.099217818 container remove e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_antonelli, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:19:08 np0005540741 systemd[1]: libpod-conmon-e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039.scope: Deactivated successfully.
Dec  1 04:19:08 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:19:08 np0005540741 python3.9[116919]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:19:08 np0005540741 podman[116959]: 2025-12-01 09:19:08.984764281 +0000 UTC m=+0.038287863 container create 4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poincare, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:19:09 np0005540741 systemd[1]: Started libpod-conmon-4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592.scope.
Dec  1 04:19:09 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:19:09 np0005540741 podman[116959]: 2025-12-01 09:19:09.059722367 +0000 UTC m=+0.113246019 container init 4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poincare, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:19:09 np0005540741 podman[116959]: 2025-12-01 09:19:08.967179636 +0000 UTC m=+0.020703238 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:19:09 np0005540741 podman[116959]: 2025-12-01 09:19:09.07108005 +0000 UTC m=+0.124603632 container start 4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poincare, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec  1 04:19:09 np0005540741 podman[116959]: 2025-12-01 09:19:09.076724835 +0000 UTC m=+0.130248437 container attach 4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poincare, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:19:09 np0005540741 silly_poincare[116998]: 167 167
Dec  1 04:19:09 np0005540741 systemd[1]: libpod-4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592.scope: Deactivated successfully.
Dec  1 04:19:09 np0005540741 podman[116959]: 2025-12-01 09:19:09.078418795 +0000 UTC m=+0.131942417 container died 4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poincare, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec  1 04:19:09 np0005540741 systemd[1]: var-lib-containers-storage-overlay-45481367aa360ee87f702722db0e483f1bbf8bee9dc86bed4c14d5d19b4af8b6-merged.mount: Deactivated successfully.
Dec  1 04:19:09 np0005540741 podman[116959]: 2025-12-01 09:19:09.12023686 +0000 UTC m=+0.173760442 container remove 4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poincare, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec  1 04:19:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v216: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:09 np0005540741 systemd[1]: libpod-conmon-4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592.scope: Deactivated successfully.
Dec  1 04:19:09 np0005540741 podman[117047]: 2025-12-01 09:19:09.286349627 +0000 UTC m=+0.048613725 container create 29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_mcclintock, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:19:09 np0005540741 systemd[1]: Started libpod-conmon-29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e.scope.
Dec  1 04:19:09 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:19:09 np0005540741 podman[117047]: 2025-12-01 09:19:09.26769569 +0000 UTC m=+0.029959818 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:19:09 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2d34f8e41fa0af715a9e6fc58726bed2dd599fe8dd93505ce90d5240b2ea45b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:19:09 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2d34f8e41fa0af715a9e6fc58726bed2dd599fe8dd93505ce90d5240b2ea45b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:19:09 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2d34f8e41fa0af715a9e6fc58726bed2dd599fe8dd93505ce90d5240b2ea45b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:19:09 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2d34f8e41fa0af715a9e6fc58726bed2dd599fe8dd93505ce90d5240b2ea45b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:19:09 np0005540741 podman[117047]: 2025-12-01 09:19:09.380767383 +0000 UTC m=+0.143031511 container init 29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_mcclintock, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Dec  1 04:19:09 np0005540741 podman[117047]: 2025-12-01 09:19:09.389778637 +0000 UTC m=+0.152042735 container start 29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_mcclintock, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 04:19:09 np0005540741 podman[117047]: 2025-12-01 09:19:09.393466005 +0000 UTC m=+0.155730113 container attach 29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_mcclintock, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:19:09 np0005540741 python3.9[117170]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]: {
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:        "osd_id": 0,
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:        "type": "bluestore"
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:    },
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:        "osd_id": 1,
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:        "type": "bluestore"
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:    },
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:        "osd_id": 2,
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:        "type": "bluestore"
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]:    }
Dec  1 04:19:10 np0005540741 busy_mcclintock[117111]: }
Dec  1 04:19:10 np0005540741 systemd[1]: libpod-29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e.scope: Deactivated successfully.
Dec  1 04:19:10 np0005540741 systemd[1]: libpod-29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e.scope: Consumed 1.117s CPU time.
Dec  1 04:19:10 np0005540741 podman[117047]: 2025-12-01 09:19:10.507704022 +0000 UTC m=+1.269968120 container died 29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_mcclintock, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Dec  1 04:19:10 np0005540741 systemd[1]: var-lib-containers-storage-overlay-b2d34f8e41fa0af715a9e6fc58726bed2dd599fe8dd93505ce90d5240b2ea45b-merged.mount: Deactivated successfully.
Dec  1 04:19:10 np0005540741 podman[117047]: 2025-12-01 09:19:10.590458787 +0000 UTC m=+1.352722875 container remove 29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_mcclintock, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:19:10 np0005540741 systemd[1]: libpod-conmon-29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e.scope: Deactivated successfully.
Dec  1 04:19:10 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:19:10 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:19:10 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:19:10 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:19:10 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev e46bacb6-851c-489a-8cc1-7f07d16ef9ea does not exist
Dec  1 04:19:10 np0005540741 python3.9[117351]: ansible-service_facts Invoked
Dec  1 04:19:10 np0005540741 network[117418]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:19:10 np0005540741 network[117424]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:19:10 np0005540741 network[117428]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:19:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v217: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:11 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:19:11 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:19:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:19:12
Dec  1 04:19:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:19:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:19:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'cephfs.cephfs.meta', 'volumes', 'images', 'cephfs.cephfs.data', '.mgr', 'vms']
Dec  1 04:19:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:19:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:19:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:19:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:19:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:19:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:19:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:19:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:19:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:19:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:19:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:19:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:19:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:19:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:19:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:19:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:19:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:19:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v218: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:19:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v219: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:15 np0005540741 python3.9[117884]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:19:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v220: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:19:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:19:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:19:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:19:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:19:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:19:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:19:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:19:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:19:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:19:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:19:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:19:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:19:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:19:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:19:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:19:18 np0005540741 python3.9[118037]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec  1 04:19:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v221: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:19 np0005540741 python3.9[118189]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:19:20 np0005540741 python3.9[118267]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:21 np0005540741 python3.9[118419]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:19:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v222: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:21 np0005540741 python3.9[118497]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:22 np0005540741 python3.9[118649]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v223: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:19:23 np0005540741 python3.9[118802]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:19:25 np0005540741 python3.9[118886]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:19:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v224: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:25 np0005540741 systemd-logind[788]: Session 38 logged out. Waiting for processes to exit.
Dec  1 04:19:25 np0005540741 systemd[1]: session-38.scope: Deactivated successfully.
Dec  1 04:19:25 np0005540741 systemd[1]: session-38.scope: Consumed 25.377s CPU time.
Dec  1 04:19:25 np0005540741 systemd-logind[788]: Removed session 38.
Dec  1 04:19:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v225: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:19:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v226: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v227: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:31 np0005540741 systemd-logind[788]: New session 39 of user zuul.
Dec  1 04:19:31 np0005540741 systemd[1]: Started Session 39 of User zuul.
Dec  1 04:19:32 np0005540741 python3.9[119068]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v228: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:19:33 np0005540741 python3.9[119220]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:19:34 np0005540741 python3.9[119298]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:34 np0005540741 systemd[1]: session-39.scope: Deactivated successfully.
Dec  1 04:19:34 np0005540741 systemd[1]: session-39.scope: Consumed 1.729s CPU time.
Dec  1 04:19:34 np0005540741 systemd-logind[788]: Session 39 logged out. Waiting for processes to exit.
Dec  1 04:19:34 np0005540741 systemd-logind[788]: Removed session 39.
Dec  1 04:19:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v229: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v230: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:38 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:19:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v231: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:39 np0005540741 systemd-logind[788]: New session 40 of user zuul.
Dec  1 04:19:39 np0005540741 systemd[1]: Started Session 40 of User zuul.
Dec  1 04:19:41 np0005540741 python3.9[119476]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:19:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v232: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:42 np0005540741 python3.9[119632]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:19:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:19:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:19:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:19:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:19:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:19:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v233: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:43 np0005540741 python3.9[119807]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:19:43 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:19:43 np0005540741 python3.9[119885]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.7heh9s99 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:44 np0005540741 python3.9[120037]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:19:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v234: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:45 np0005540741 python3.9[120115]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.mc6gfrtd recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:46 np0005540741 python3.9[120267]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:19:46 np0005540741 python3.9[120419]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:19:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v235: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:47 np0005540741 python3.9[120497]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:19:47 np0005540741 python3.9[120649]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:19:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:19:48 np0005540741 python3.9[120727]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:19:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v236: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:49 np0005540741 python3.9[120879]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:50 np0005540741 python3.9[121031]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:19:50 np0005540741 python3.9[121109]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v237: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:51 np0005540741 python3.9[121261]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:19:51 np0005540741 python3.9[121339]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v238: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:53 np0005540741 python3.9[121491]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:19:53 np0005540741 systemd[1]: Reloading.
Dec  1 04:19:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:19:53 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:19:53 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:19:54 np0005540741 python3.9[121681]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:19:55 np0005540741 python3.9[121759]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v239: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:55 np0005540741 python3.9[121911]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:19:56 np0005540741 python3.9[121989]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:57 np0005540741 python3.9[122141]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:19:57 np0005540741 systemd[1]: Reloading.
Dec  1 04:19:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v240: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:19:57 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:19:57 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:19:57 np0005540741 systemd[1]: Starting Create netns directory...
Dec  1 04:19:57 np0005540741 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  1 04:19:57 np0005540741 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  1 04:19:57 np0005540741 systemd[1]: Finished Create netns directory.
Dec  1 04:19:58 np0005540741 python3.9[122333]: ansible-ansible.builtin.service_facts Invoked
Dec  1 04:19:58 np0005540741 network[122350]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:19:58 np0005540741 network[122351]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:19:58 np0005540741 network[122352]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:19:58 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:19:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v241: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v242: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v243: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:03 np0005540741 python3.9[122614]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:20:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:20:03 np0005540741 python3.9[122692]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:04 np0005540741 python3.9[122844]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v244: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:05 np0005540741 python3.9[122996]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:20:05 np0005540741 python3.9[123074]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:06 np0005540741 python3.9[123226]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  1 04:20:06 np0005540741 systemd[1]: Starting Time & Date Service...
Dec  1 04:20:06 np0005540741 systemd[1]: Started Time & Date Service.
Dec  1 04:20:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v245: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:07 np0005540741 python3.9[123382]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:08 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:20:08 np0005540741 python3.9[123534]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:20:09 np0005540741 python3.9[123612]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v246: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:09 np0005540741 python3.9[123764]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:20:10 np0005540741 python3.9[123842]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.fe4u1wl8 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:11 np0005540741 python3.9[124042]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:20:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v247: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:20:11 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:20:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:20:11 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:20:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:20:11 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:20:11 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev f360f47f-3d94-438a-9c40-240009d896ed does not exist
Dec  1 04:20:11 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 9254ea8c-1cab-4c87-97c9-c7db68f71411 does not exist
Dec  1 04:20:11 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 590c3f9e-41e3-4055-85ed-67f9c2c7779a does not exist
Dec  1 04:20:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:20:11 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:20:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:20:11 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:20:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:20:11 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:20:11 np0005540741 python3.9[124192]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:12 np0005540741 podman[124422]: 2025-12-01 09:20:12.319445618 +0000 UTC m=+0.090495113 container create f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_banach, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:20:12 np0005540741 podman[124422]: 2025-12-01 09:20:12.255271428 +0000 UTC m=+0.026320943 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:20:12 np0005540741 systemd[1]: Started libpod-conmon-f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364.scope.
Dec  1 04:20:12 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:20:12 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:20:12 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:20:12 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:20:12 np0005540741 podman[124422]: 2025-12-01 09:20:12.416123956 +0000 UTC m=+0.187173501 container init f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_banach, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:20:12 np0005540741 podman[124422]: 2025-12-01 09:20:12.425022354 +0000 UTC m=+0.196071869 container start f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_banach, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Dec  1 04:20:12 np0005540741 podman[124422]: 2025-12-01 09:20:12.430032354 +0000 UTC m=+0.201081879 container attach f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_banach, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:20:12 np0005540741 naughty_banach[124439]: 167 167
Dec  1 04:20:12 np0005540741 systemd[1]: libpod-f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364.scope: Deactivated successfully.
Dec  1 04:20:12 np0005540741 conmon[124439]: conmon f6e6ae89457b70a295ef <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364.scope/container/memory.events
Dec  1 04:20:12 np0005540741 podman[124422]: 2025-12-01 09:20:12.433606762 +0000 UTC m=+0.204656287 container died f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_banach, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec  1 04:20:12 np0005540741 systemd[1]: var-lib-containers-storage-overlay-cce5b5e9d3886810a25a67700d3bd985a1ff3815a76fb42e8ead49f5bd94fded-merged.mount: Deactivated successfully.
Dec  1 04:20:12 np0005540741 podman[124422]: 2025-12-01 09:20:12.487790452 +0000 UTC m=+0.258839957 container remove f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_banach, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, ceph=True)
Dec  1 04:20:12 np0005540741 systemd[1]: libpod-conmon-f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364.scope: Deactivated successfully.
Dec  1 04:20:12 np0005540741 podman[124537]: 2025-12-01 09:20:12.686765167 +0000 UTC m=+0.076916725 container create 4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bhabha, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Dec  1 04:20:12 np0005540741 systemd[1]: Started libpod-conmon-4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03.scope.
Dec  1 04:20:12 np0005540741 podman[124537]: 2025-12-01 09:20:12.658549428 +0000 UTC m=+0.048701016 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:20:12 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:20:12 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68aca2e2454b2770cf6e02cde0af8204f9eef16810bc562776ff0addb8defc75/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:20:12 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68aca2e2454b2770cf6e02cde0af8204f9eef16810bc562776ff0addb8defc75/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:20:12 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68aca2e2454b2770cf6e02cde0af8204f9eef16810bc562776ff0addb8defc75/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:20:12 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68aca2e2454b2770cf6e02cde0af8204f9eef16810bc562776ff0addb8defc75/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:20:12 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68aca2e2454b2770cf6e02cde0af8204f9eef16810bc562776ff0addb8defc75/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:20:12 np0005540741 podman[124537]: 2025-12-01 09:20:12.790116485 +0000 UTC m=+0.180268063 container init 4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bhabha, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:20:12 np0005540741 podman[124537]: 2025-12-01 09:20:12.79890204 +0000 UTC m=+0.189053598 container start 4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bhabha, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec  1 04:20:12 np0005540741 podman[124537]: 2025-12-01 09:20:12.802792517 +0000 UTC m=+0.192944095 container attach 4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bhabha, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:20:12 np0005540741 python3.9[124538]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:20:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:20:12
Dec  1 04:20:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:20:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:20:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['cephfs.cephfs.data', 'vms', 'images', 'volumes', 'cephfs.cephfs.meta', 'backups', '.mgr']
Dec  1 04:20:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:20:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:20:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:20:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:20:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:20:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:20:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:20:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:20:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:20:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:20:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:20:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:20:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:20:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:20:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:20:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:20:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:20:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v248: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:20:13 np0005540741 python3[124720]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  1 04:20:14 np0005540741 elegant_bhabha[124555]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:20:14 np0005540741 elegant_bhabha[124555]: --> relative data size: 1.0
Dec  1 04:20:14 np0005540741 elegant_bhabha[124555]: --> All data devices are unavailable
Dec  1 04:20:14 np0005540741 systemd[1]: libpod-4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03.scope: Deactivated successfully.
Dec  1 04:20:14 np0005540741 systemd[1]: libpod-4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03.scope: Consumed 1.167s CPU time.
Dec  1 04:20:14 np0005540741 podman[124537]: 2025-12-01 09:20:14.04164241 +0000 UTC m=+1.431793978 container died 4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bhabha, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:20:14 np0005540741 systemd[1]: var-lib-containers-storage-overlay-68aca2e2454b2770cf6e02cde0af8204f9eef16810bc562776ff0addb8defc75-merged.mount: Deactivated successfully.
Dec  1 04:20:14 np0005540741 podman[124537]: 2025-12-01 09:20:14.106918943 +0000 UTC m=+1.497070491 container remove 4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bhabha, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:20:14 np0005540741 systemd[1]: libpod-conmon-4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03.scope: Deactivated successfully.
Dec  1 04:20:14 np0005540741 python3.9[125002]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:20:14 np0005540741 podman[125066]: 2025-12-01 09:20:14.762514243 +0000 UTC m=+0.050172210 container create d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_brahmagupta, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:20:14 np0005540741 systemd[1]: Started libpod-conmon-d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da.scope.
Dec  1 04:20:14 np0005540741 podman[125066]: 2025-12-01 09:20:14.737349016 +0000 UTC m=+0.025006993 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:20:14 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:20:14 np0005540741 podman[125066]: 2025-12-01 09:20:14.850150949 +0000 UTC m=+0.137808936 container init d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_brahmagupta, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec  1 04:20:14 np0005540741 podman[125066]: 2025-12-01 09:20:14.857046036 +0000 UTC m=+0.144704003 container start d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_brahmagupta, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec  1 04:20:14 np0005540741 podman[125066]: 2025-12-01 09:20:14.861802609 +0000 UTC m=+0.149460576 container attach d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_brahmagupta, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:20:14 np0005540741 festive_brahmagupta[125108]: 167 167
Dec  1 04:20:14 np0005540741 systemd[1]: libpod-d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da.scope: Deactivated successfully.
Dec  1 04:20:14 np0005540741 podman[125066]: 2025-12-01 09:20:14.865060187 +0000 UTC m=+0.152718154 container died d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_brahmagupta, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 04:20:14 np0005540741 systemd[1]: var-lib-containers-storage-overlay-b8a0cb42acf3c562e529c0261f67c26e99c71779a7411db556eaa16900e87cef-merged.mount: Deactivated successfully.
Dec  1 04:20:15 np0005540741 podman[125066]: 2025-12-01 09:20:15.024464082 +0000 UTC m=+0.312122049 container remove d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_brahmagupta, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Dec  1 04:20:15 np0005540741 systemd[1]: libpod-conmon-d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da.scope: Deactivated successfully.
Dec  1 04:20:15 np0005540741 python3.9[125142]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v249: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:15 np0005540741 podman[125167]: 2025-12-01 09:20:15.195658311 +0000 UTC m=+0.048917672 container create d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_faraday, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec  1 04:20:15 np0005540741 systemd[1]: Started libpod-conmon-d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5.scope.
Dec  1 04:20:15 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:20:15 np0005540741 podman[125167]: 2025-12-01 09:20:15.177152505 +0000 UTC m=+0.030411886 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:20:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d3f2e9583192df983eef63b4e4f1b0eddd028131eea98f0b921ba47f1f0c7fb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:20:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d3f2e9583192df983eef63b4e4f1b0eddd028131eea98f0b921ba47f1f0c7fb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:20:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d3f2e9583192df983eef63b4e4f1b0eddd028131eea98f0b921ba47f1f0c7fb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:20:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d3f2e9583192df983eef63b4e4f1b0eddd028131eea98f0b921ba47f1f0c7fb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:20:15 np0005540741 podman[125167]: 2025-12-01 09:20:15.288475032 +0000 UTC m=+0.141734413 container init d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_faraday, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:20:15 np0005540741 podman[125167]: 2025-12-01 09:20:15.297121272 +0000 UTC m=+0.150380633 container start d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_faraday, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Dec  1 04:20:15 np0005540741 podman[125167]: 2025-12-01 09:20:15.301347359 +0000 UTC m=+0.154606720 container attach d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_faraday, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Dec  1 04:20:15 np0005540741 python3.9[125334]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]: {
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:    "0": [
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:        {
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "devices": [
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "/dev/loop3"
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            ],
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "lv_name": "ceph_lv0",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "lv_size": "21470642176",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "name": "ceph_lv0",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "tags": {
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.cluster_name": "ceph",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.crush_device_class": "",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.encrypted": "0",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.osd_id": "0",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.type": "block",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.vdo": "0"
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            },
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "type": "block",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "vg_name": "ceph_vg0"
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:        }
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:    ],
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:    "1": [
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:        {
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "devices": [
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "/dev/loop4"
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            ],
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "lv_name": "ceph_lv1",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "lv_size": "21470642176",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "name": "ceph_lv1",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "tags": {
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.cluster_name": "ceph",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.crush_device_class": "",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.encrypted": "0",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.osd_id": "1",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.type": "block",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.vdo": "0"
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            },
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "type": "block",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "vg_name": "ceph_vg1"
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:        }
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:    ],
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:    "2": [
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:        {
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "devices": [
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "/dev/loop5"
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            ],
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "lv_name": "ceph_lv2",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "lv_size": "21470642176",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "name": "ceph_lv2",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "tags": {
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.cluster_name": "ceph",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.crush_device_class": "",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.encrypted": "0",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.osd_id": "2",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.type": "block",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:                "ceph.vdo": "0"
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            },
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "type": "block",
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:            "vg_name": "ceph_vg2"
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:        }
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]:    ]
Dec  1 04:20:16 np0005540741 jolly_faraday[125207]: }
Dec  1 04:20:16 np0005540741 systemd[1]: libpod-d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5.scope: Deactivated successfully.
Dec  1 04:20:16 np0005540741 podman[125167]: 2025-12-01 09:20:16.096639401 +0000 UTC m=+0.949898822 container died d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_faraday, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:20:16 np0005540741 systemd[1]: var-lib-containers-storage-overlay-6d3f2e9583192df983eef63b4e4f1b0eddd028131eea98f0b921ba47f1f0c7fb-merged.mount: Deactivated successfully.
Dec  1 04:20:16 np0005540741 podman[125167]: 2025-12-01 09:20:16.188638188 +0000 UTC m=+1.041897579 container remove d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_faraday, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec  1 04:20:16 np0005540741 systemd[1]: libpod-conmon-d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5.scope: Deactivated successfully.
Dec  1 04:20:16 np0005540741 python3.9[125422]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:16 np0005540741 podman[125701]: 2025-12-01 09:20:16.927203194 +0000 UTC m=+0.045442878 container create c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_payne, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec  1 04:20:16 np0005540741 systemd[1]: Started libpod-conmon-c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1.scope.
Dec  1 04:20:16 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:20:17 np0005540741 podman[125701]: 2025-12-01 09:20:16.90815552 +0000 UTC m=+0.026395234 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:20:17 np0005540741 podman[125701]: 2025-12-01 09:20:17.011773727 +0000 UTC m=+0.130013491 container init c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_payne, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec  1 04:20:17 np0005540741 podman[125701]: 2025-12-01 09:20:17.021762998 +0000 UTC m=+0.140002682 container start c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_payne, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:20:17 np0005540741 podman[125701]: 2025-12-01 09:20:17.025779489 +0000 UTC m=+0.144019273 container attach c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_payne, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Dec  1 04:20:17 np0005540741 compassionate_payne[125735]: 167 167
Dec  1 04:20:17 np0005540741 systemd[1]: libpod-c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1.scope: Deactivated successfully.
Dec  1 04:20:17 np0005540741 podman[125701]: 2025-12-01 09:20:17.030376817 +0000 UTC m=+0.148616501 container died c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_payne, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef)
Dec  1 04:20:17 np0005540741 systemd[1]: var-lib-containers-storage-overlay-3ffaabc829525d83e468e26cf13f00583e02a221b7c5b5fcbb9c258033e16f40-merged.mount: Deactivated successfully.
Dec  1 04:20:17 np0005540741 podman[125701]: 2025-12-01 09:20:17.066488893 +0000 UTC m=+0.184728577 container remove c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_payne, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:20:17 np0005540741 systemd[1]: libpod-conmon-c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1.scope: Deactivated successfully.
Dec  1 04:20:17 np0005540741 python3.9[125732]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:20:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v250: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:17 np0005540741 podman[125761]: 2025-12-01 09:20:17.227663761 +0000 UTC m=+0.046712106 container create dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_beaver, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:20:17 np0005540741 systemd[1]: Started libpod-conmon-dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7.scope.
Dec  1 04:20:17 np0005540741 podman[125761]: 2025-12-01 09:20:17.206560316 +0000 UTC m=+0.025608681 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:20:17 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:20:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d09fbd0dd65100cfc9b462d043041bd1b849ca951a95d9f9822235e7673631aa/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:20:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d09fbd0dd65100cfc9b462d043041bd1b849ca951a95d9f9822235e7673631aa/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:20:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d09fbd0dd65100cfc9b462d043041bd1b849ca951a95d9f9822235e7673631aa/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:20:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d09fbd0dd65100cfc9b462d043041bd1b849ca951a95d9f9822235e7673631aa/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:20:17 np0005540741 podman[125761]: 2025-12-01 09:20:17.323175734 +0000 UTC m=+0.142224109 container init dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_beaver, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec  1 04:20:17 np0005540741 podman[125761]: 2025-12-01 09:20:17.338185306 +0000 UTC m=+0.157233681 container start dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_beaver, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec  1 04:20:17 np0005540741 podman[125761]: 2025-12-01 09:20:17.343010381 +0000 UTC m=+0.162058746 container attach dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_beaver, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:20:17 np0005540741 python3.9[125858]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:18 np0005540741 python3.9[126018]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]: {
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:        "osd_id": 0,
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:        "type": "bluestore"
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:    },
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:        "osd_id": 1,
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:        "type": "bluestore"
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:    },
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:        "osd_id": 2,
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:        "type": "bluestore"
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]:    }
Dec  1 04:20:18 np0005540741 admiring_beaver[125801]: }
Dec  1 04:20:18 np0005540741 systemd[1]: libpod-dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7.scope: Deactivated successfully.
Dec  1 04:20:18 np0005540741 systemd[1]: libpod-dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7.scope: Consumed 1.063s CPU time.
Dec  1 04:20:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:20:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:20:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:20:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:20:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:20:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:20:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:20:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:20:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:20:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:20:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:20:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:20:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:20:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:20:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:20:18 np0005540741 podman[126041]: 2025-12-01 09:20:18.44342973 +0000 UTC m=+0.031968193 container died dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_beaver, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:20:18 np0005540741 systemd[1]: var-lib-containers-storage-overlay-d09fbd0dd65100cfc9b462d043041bd1b849ca951a95d9f9822235e7673631aa-merged.mount: Deactivated successfully.
Dec  1 04:20:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:20:18 np0005540741 podman[126041]: 2025-12-01 09:20:18.518654413 +0000 UTC m=+0.107192796 container remove dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_beaver, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec  1 04:20:18 np0005540741 systemd[1]: libpod-conmon-dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7.scope: Deactivated successfully.
Dec  1 04:20:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:20:18 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:20:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:20:18 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:20:18 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev f23033d6-fc11-4a22-bb61-d739105305cd does not exist
Dec  1 04:20:18 np0005540741 python3.9[126134]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v251: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:19 np0005540741 python3.9[126333]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:20:19 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:20:19 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:20:20 np0005540741 python3.9[126411]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:20 np0005540741 python3.9[126563]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:20:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v252: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:21 np0005540741 python3.9[126718]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:22 np0005540741 python3.9[126870]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v253: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:23 np0005540741 python3.9[127022]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:20:24 np0005540741 python3.9[127174]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  1 04:20:24 np0005540741 python3.9[127326]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  1 04:20:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v254: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:25 np0005540741 systemd[1]: session-40.scope: Deactivated successfully.
Dec  1 04:20:25 np0005540741 systemd[1]: session-40.scope: Consumed 34.192s CPU time.
Dec  1 04:20:25 np0005540741 systemd-logind[788]: Session 40 logged out. Waiting for processes to exit.
Dec  1 04:20:25 np0005540741 systemd-logind[788]: Removed session 40.
Dec  1 04:20:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v255: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:20:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v256: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v257: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:31 np0005540741 systemd-logind[788]: New session 41 of user zuul.
Dec  1 04:20:31 np0005540741 systemd[1]: Started Session 41 of User zuul.
Dec  1 04:20:32 np0005540741 python3.9[127507]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec  1 04:20:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v258: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:20:33 np0005540741 python3.9[127659]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:20:34 np0005540741 python3.9[127813]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Dec  1 04:20:35 np0005540741 python3.9[127965]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.in05qk5a follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:20:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v259: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:35 np0005540741 python3.9[128090]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.in05qk5a mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580834.623237-44-178067481465054/.source.in05qk5a _original_basename=.1ewlnk6z follow=False checksum=2242aa230a3299b7aca23dfd6feceb6f43ae540f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:36 np0005540741 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  1 04:20:37 np0005540741 python3.9[128242]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:20:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v260: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:38 np0005540741 python3.9[128396]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDRTxmAPcz2eFUCrQOAknLp4ibCvALuiJ7iA+ICPT8Mpd8XYcXDdZBZjlSgWd0U+d6qvFNYaJ4Kq/cNnxeSVMCkpQCGri3TTRfaS9L5COiCf0cmBNheHZSQL0uZLjKzjeaIyGWH6HdOA7KUsCK2YT/Iyf0OJzrBs5vhWuzbSXsCjsHTSzR+XxRX3C/ImHAtccLwxysUhm6H4CGIPn0bY/YGgoRkJUvouHT/4kSxhQrtFAKJOWlJ01d3tdISKrGa+SiKU6zq4yCgT5yeSsMSRyP+L06UuH7Htv2BSPXmTFLy8alJrAKLo19SllAr6m5ZP3OWy9eRDvp+oa4ZA3J9JX+isLwhjDkF1Q+aes+99JQ6E7W5hL8qvDAHCwaKgIo1IRMHJEVvZNsKqn+ME9EBDD1WyTNzik/qEOj2Cr9TXxmps8zD0VcngBAhdAv39R6EAPnVfRf1Goyagp6gPsCOeulh58jgrvAZ7L89u1J5yZY4C2Cu9js9UJwp46pdgU5qDDM=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILCzsFh+ZK0hqueDU2gWvb+j6m7hD/RYc8+thzHnJPmj#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN8lUi9ZvyyCZ7KdPvA7WBYtjDR8VhQzZuiukEvvpoRp0UJKIzVf11cXzP5sRkLnexUeWiXTv+jZK8hoAN9Othc=#012 create=True mode=0644 path=/tmp/ansible.in05qk5a state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:38 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:20:38 np0005540741 python3.9[128548]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.in05qk5a' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:20:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v261: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:39 np0005540741 python3.9[128702]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.in05qk5a state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:40 np0005540741 systemd[1]: session-41.scope: Deactivated successfully.
Dec  1 04:20:40 np0005540741 systemd[1]: session-41.scope: Consumed 5.662s CPU time.
Dec  1 04:20:40 np0005540741 systemd-logind[788]: Session 41 logged out. Waiting for processes to exit.
Dec  1 04:20:40 np0005540741 systemd-logind[788]: Removed session 41.
Dec  1 04:20:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v262: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:20:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:20:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:20:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:20:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:20:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:20:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v263: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:43 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:20:43 np0005540741 systemd[1]: session-18.scope: Deactivated successfully.
Dec  1 04:20:43 np0005540741 systemd[1]: session-18.scope: Consumed 1min 27.611s CPU time.
Dec  1 04:20:43 np0005540741 systemd-logind[788]: Session 18 logged out. Waiting for processes to exit.
Dec  1 04:20:43 np0005540741 systemd-logind[788]: Removed session 18.
Dec  1 04:20:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v264: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:45 np0005540741 systemd-logind[788]: New session 42 of user zuul.
Dec  1 04:20:45 np0005540741 systemd[1]: Started Session 42 of User zuul.
Dec  1 04:20:46 np0005540741 python3.9[128881]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:20:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v265: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:48 np0005540741 python3.9[129037]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  1 04:20:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:20:48 np0005540741 python3.9[129191]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:20:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v266: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:49 np0005540741 python3.9[129344]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:20:50 np0005540741 python3.9[129497]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:20:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v267: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:51 np0005540741 python3.9[129649]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:20:52 np0005540741 systemd[1]: session-42.scope: Deactivated successfully.
Dec  1 04:20:52 np0005540741 systemd[1]: session-42.scope: Consumed 4.273s CPU time.
Dec  1 04:20:52 np0005540741 systemd-logind[788]: Session 42 logged out. Waiting for processes to exit.
Dec  1 04:20:52 np0005540741 systemd-logind[788]: Removed session 42.
Dec  1 04:20:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v268: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #18. Immutable memtables: 0.
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.566451) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 18
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580854566563, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 6560, "num_deletes": 251, "total_data_size": 7020721, "memory_usage": 7223840, "flush_reason": "Manual Compaction"}
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #19: started
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580854605564, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 19, "file_size": 5332367, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 132, "largest_seqno": 6689, "table_properties": {"data_size": 5309824, "index_size": 14365, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7109, "raw_key_size": 62773, "raw_average_key_size": 22, "raw_value_size": 5256505, "raw_average_value_size": 1863, "num_data_blocks": 647, "num_entries": 2821, "num_filter_entries": 2821, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580342, "oldest_key_time": 1764580342, "file_creation_time": 1764580854, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 19, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 39178 microseconds, and 17125 cpu microseconds.
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.605633) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #19: 5332367 bytes OK
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.605668) [db/memtable_list.cc:519] [default] Level-0 commit table #19 started
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.607044) [db/memtable_list.cc:722] [default] Level-0 commit table #19: memtable #1 done
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.607062) EVENT_LOG_v1 {"time_micros": 1764580854607057, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [3, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.607091) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[3 0 0 0 0 0 0] max score 0.75
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 6993030, prev total WAL file size 6993030, number of live WAL files 2.
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000014.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.609258) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 3@0 files to L6, score -1.00
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [19(5207KB) 13(50KB) 8(1944B)]
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580854609527, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [19, 13, 8], "score": -1, "input_data_size": 5386105, "oldest_snapshot_seqno": -1}
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #20: 2633 keys, 5343204 bytes, temperature: kUnknown
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580854659891, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 20, "file_size": 5343204, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5321116, "index_size": 14427, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 6597, "raw_key_size": 60750, "raw_average_key_size": 23, "raw_value_size": 5269488, "raw_average_value_size": 2001, "num_data_blocks": 649, "num_entries": 2633, "num_filter_entries": 2633, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764580854, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.660217) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 3@0 files to L6 => 5343204 bytes
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.661662) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 106.7 rd, 105.9 wr, level 6, files in(3, 0) out(1 +0 blob) MB in(5.1, 0.0 +0.0 blob) out(5.1 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 2922, records dropped: 289 output_compression: NoCompression
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.661700) EVENT_LOG_v1 {"time_micros": 1764580854661681, "job": 4, "event": "compaction_finished", "compaction_time_micros": 50469, "compaction_time_cpu_micros": 29139, "output_level": 6, "num_output_files": 1, "total_output_size": 5343204, "num_input_records": 2922, "num_output_records": 2633, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000019.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580854664698, "job": 4, "event": "table_file_deletion", "file_number": 19}
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000013.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580854664812, "job": 4, "event": "table_file_deletion", "file_number": 13}
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580854664880, "job": 4, "event": "table_file_deletion", "file_number": 8}
Dec  1 04:20:54 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.609016) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:20:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v269: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v270: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:58 np0005540741 systemd-logind[788]: New session 43 of user zuul.
Dec  1 04:20:58 np0005540741 systemd[1]: Started Session 43 of User zuul.
Dec  1 04:20:58 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:20:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v271: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:20:59 np0005540741 python3.9[129828]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:21:00 np0005540741 python3.9[129984]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:21:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v272: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:01 np0005540741 python3.9[130068]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  1 04:21:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v273: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:21:03 np0005540741 python3.9[130219]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:21:05 np0005540741 python3.9[130370]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 04:21:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v274: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:05 np0005540741 python3.9[130520]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:21:06 np0005540741 python3.9[130670]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:21:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v275: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:07 np0005540741 systemd[1]: session-43.scope: Deactivated successfully.
Dec  1 04:21:07 np0005540741 systemd[1]: session-43.scope: Consumed 6.806s CPU time.
Dec  1 04:21:07 np0005540741 systemd-logind[788]: Session 43 logged out. Waiting for processes to exit.
Dec  1 04:21:07 np0005540741 systemd-logind[788]: Removed session 43.
Dec  1 04:21:08 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:21:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v276: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v277: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:12 np0005540741 systemd-logind[788]: New session 44 of user zuul.
Dec  1 04:21:12 np0005540741 systemd[1]: Started Session 44 of User zuul.
Dec  1 04:21:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:21:12
Dec  1 04:21:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:21:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:21:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['cephfs.cephfs.meta', 'vms', 'backups', 'images', '.mgr', 'cephfs.cephfs.data', 'volumes']
Dec  1 04:21:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:21:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:21:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:21:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:21:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:21:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:21:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:21:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:21:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:21:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:21:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:21:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:21:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:21:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:21:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:21:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:21:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:21:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v278: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:21:13 np0005540741 python3.9[130848]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:21:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v279: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:15 np0005540741 python3.9[131004]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:21:16 np0005540741 python3.9[131156]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:21:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v280: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:17 np0005540741 python3.9[131308]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:21:18 np0005540741 python3.9[131431]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580876.7077625-65-248268948263347/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=d66b669105720ca8abb42d3b5b02733184f83aa9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:21:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:21:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:21:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:21:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:21:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:21:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:21:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:21:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:21:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:21:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:21:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:21:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:21:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:21:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:21:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:21:19 np0005540741 python3.9[131613]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:21:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v281: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:21:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:21:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:21:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:21:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:21:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:21:19 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 539a95bf-07d9-4bab-a25b-c5f8ba7b37b9 does not exist
Dec  1 04:21:19 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev b48f8752-d89b-4b48-ac28-ed31ce7d11d5 does not exist
Dec  1 04:21:19 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev fe6b68ed-a32c-418a-a387-c5d1ec2acbce does not exist
Dec  1 04:21:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:21:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:21:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:21:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:21:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:21:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:21:19 np0005540741 python3.9[131837]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580878.5105698-65-42796182260776/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=08ea373ef3298387cada856fae069f7192f4a06e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:19 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:21:19 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:21:19 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:21:20 np0005540741 podman[132125]: 2025-12-01 09:21:20.130513924 +0000 UTC m=+0.051502481 container create 20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_davinci, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:21:20 np0005540741 systemd[1]: Started libpod-conmon-20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65.scope.
Dec  1 04:21:20 np0005540741 podman[132125]: 2025-12-01 09:21:20.110423157 +0000 UTC m=+0.031411744 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:21:20 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:21:20 np0005540741 podman[132125]: 2025-12-01 09:21:20.226862474 +0000 UTC m=+0.147851061 container init 20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_davinci, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec  1 04:21:20 np0005540741 podman[132125]: 2025-12-01 09:21:20.240468765 +0000 UTC m=+0.161457332 container start 20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_davinci, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec  1 04:21:20 np0005540741 podman[132125]: 2025-12-01 09:21:20.244891982 +0000 UTC m=+0.165880569 container attach 20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_davinci, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec  1 04:21:20 np0005540741 keen_davinci[132147]: 167 167
Dec  1 04:21:20 np0005540741 systemd[1]: libpod-20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65.scope: Deactivated successfully.
Dec  1 04:21:20 np0005540741 podman[132125]: 2025-12-01 09:21:20.253901111 +0000 UTC m=+0.174889668 container died 20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_davinci, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Dec  1 04:21:20 np0005540741 systemd[1]: var-lib-containers-storage-overlay-7d36b12b6834588e3b147be31070e7e51c80bb87f496efd43eb80000371519a5-merged.mount: Deactivated successfully.
Dec  1 04:21:20 np0005540741 python3.9[132139]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:21:20 np0005540741 podman[132125]: 2025-12-01 09:21:20.293942622 +0000 UTC m=+0.214931179 container remove 20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_davinci, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:21:20 np0005540741 systemd[1]: libpod-conmon-20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65.scope: Deactivated successfully.
Dec  1 04:21:20 np0005540741 podman[132193]: 2025-12-01 09:21:20.464053063 +0000 UTC m=+0.052866491 container create d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_proskuriakova, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:21:20 np0005540741 systemd[1]: Started libpod-conmon-d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed.scope.
Dec  1 04:21:20 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:21:20 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73244b3d473a99837e152b8ce8b92c35256bce656eed056fd9144e9f192bf71a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:21:20 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73244b3d473a99837e152b8ce8b92c35256bce656eed056fd9144e9f192bf71a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:21:20 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73244b3d473a99837e152b8ce8b92c35256bce656eed056fd9144e9f192bf71a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:21:20 np0005540741 podman[132193]: 2025-12-01 09:21:20.438384675 +0000 UTC m=+0.027198093 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:21:20 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73244b3d473a99837e152b8ce8b92c35256bce656eed056fd9144e9f192bf71a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:21:20 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73244b3d473a99837e152b8ce8b92c35256bce656eed056fd9144e9f192bf71a/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:21:20 np0005540741 podman[132193]: 2025-12-01 09:21:20.547537792 +0000 UTC m=+0.136351200 container init d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_proskuriakova, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec  1 04:21:20 np0005540741 podman[132193]: 2025-12-01 09:21:20.554023649 +0000 UTC m=+0.142837067 container start d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_proskuriakova, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:21:20 np0005540741 podman[132193]: 2025-12-01 09:21:20.560435913 +0000 UTC m=+0.149249371 container attach d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_proskuriakova, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec  1 04:21:21 np0005540741 python3.9[132312]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580879.8078876-65-126916532795187/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=e936f8afd9f4f5c6814ddbf37b17bc0751f9c27f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v282: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:21 np0005540741 nervous_proskuriakova[132255]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:21:21 np0005540741 nervous_proskuriakova[132255]: --> relative data size: 1.0
Dec  1 04:21:21 np0005540741 nervous_proskuriakova[132255]: --> All data devices are unavailable
Dec  1 04:21:21 np0005540741 systemd[1]: libpod-d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed.scope: Deactivated successfully.
Dec  1 04:21:21 np0005540741 systemd[1]: libpod-d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed.scope: Consumed 1.091s CPU time.
Dec  1 04:21:21 np0005540741 podman[132193]: 2025-12-01 09:21:21.716239999 +0000 UTC m=+1.305053437 container died d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_proskuriakova, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec  1 04:21:21 np0005540741 systemd[1]: var-lib-containers-storage-overlay-73244b3d473a99837e152b8ce8b92c35256bce656eed056fd9144e9f192bf71a-merged.mount: Deactivated successfully.
Dec  1 04:21:21 np0005540741 podman[132193]: 2025-12-01 09:21:21.787613991 +0000 UTC m=+1.376427399 container remove d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_proskuriakova, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec  1 04:21:21 np0005540741 systemd[1]: libpod-conmon-d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed.scope: Deactivated successfully.
Dec  1 04:21:21 np0005540741 python3.9[132484]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:21:22 np0005540741 podman[132792]: 2025-12-01 09:21:22.440890591 +0000 UTC m=+0.051424829 container create 24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_hopper, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:21:22 np0005540741 systemd[1]: Started libpod-conmon-24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca.scope.
Dec  1 04:21:22 np0005540741 python3.9[132777]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:21:22 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:21:22 np0005540741 podman[132792]: 2025-12-01 09:21:22.420264698 +0000 UTC m=+0.030798976 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:21:22 np0005540741 podman[132792]: 2025-12-01 09:21:22.528349325 +0000 UTC m=+0.138883593 container init 24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_hopper, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec  1 04:21:22 np0005540741 podman[132792]: 2025-12-01 09:21:22.535868791 +0000 UTC m=+0.146403039 container start 24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_hopper, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec  1 04:21:22 np0005540741 podman[132792]: 2025-12-01 09:21:22.540046391 +0000 UTC m=+0.150580629 container attach 24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_hopper, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:21:22 np0005540741 eloquent_hopper[132808]: 167 167
Dec  1 04:21:22 np0005540741 systemd[1]: libpod-24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca.scope: Deactivated successfully.
Dec  1 04:21:22 np0005540741 podman[132792]: 2025-12-01 09:21:22.542608785 +0000 UTC m=+0.153143033 container died 24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_hopper, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  1 04:21:22 np0005540741 systemd[1]: var-lib-containers-storage-overlay-f98b2a5458889717d101f816b7e8c68926f47ad83408dd4226c34ddea35b87da-merged.mount: Deactivated successfully.
Dec  1 04:21:22 np0005540741 podman[132792]: 2025-12-01 09:21:22.631346576 +0000 UTC m=+0.241880844 container remove 24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_hopper, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:21:22 np0005540741 systemd[1]: libpod-conmon-24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca.scope: Deactivated successfully.
Dec  1 04:21:22 np0005540741 podman[132897]: 2025-12-01 09:21:22.84925793 +0000 UTC m=+0.055879637 container create 1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_hawking, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  1 04:21:22 np0005540741 systemd[1]: Started libpod-conmon-1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533.scope.
Dec  1 04:21:22 np0005540741 podman[132897]: 2025-12-01 09:21:22.826278329 +0000 UTC m=+0.032900046 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:21:22 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:21:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89a673cc04a6dc9b95e247967adbb3267968a74c3c7ed3b38c6558a83dd2bbb6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:21:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89a673cc04a6dc9b95e247967adbb3267968a74c3c7ed3b38c6558a83dd2bbb6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:21:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89a673cc04a6dc9b95e247967adbb3267968a74c3c7ed3b38c6558a83dd2bbb6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:21:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89a673cc04a6dc9b95e247967adbb3267968a74c3c7ed3b38c6558a83dd2bbb6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:21:22 np0005540741 podman[132897]: 2025-12-01 09:21:22.949611765 +0000 UTC m=+0.156233532 container init 1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_hawking, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Dec  1 04:21:22 np0005540741 podman[132897]: 2025-12-01 09:21:22.961502817 +0000 UTC m=+0.168124474 container start 1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_hawking, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:21:22 np0005540741 podman[132897]: 2025-12-01 09:21:22.965150082 +0000 UTC m=+0.171771849 container attach 1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_hawking, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:21:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v283: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:23 np0005540741 python3.9[133005]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:21:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]: {
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:    "0": [
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:        {
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "devices": [
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "/dev/loop3"
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            ],
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "lv_name": "ceph_lv0",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "lv_size": "21470642176",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "name": "ceph_lv0",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "tags": {
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.cluster_name": "ceph",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.crush_device_class": "",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.encrypted": "0",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.osd_id": "0",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.type": "block",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.vdo": "0"
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            },
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "type": "block",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "vg_name": "ceph_vg0"
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:        }
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:    ],
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:    "1": [
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:        {
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "devices": [
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "/dev/loop4"
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            ],
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "lv_name": "ceph_lv1",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "lv_size": "21470642176",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "name": "ceph_lv1",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "tags": {
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.cluster_name": "ceph",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.crush_device_class": "",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.encrypted": "0",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.osd_id": "1",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.type": "block",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.vdo": "0"
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            },
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "type": "block",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "vg_name": "ceph_vg1"
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:        }
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:    ],
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:    "2": [
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:        {
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "devices": [
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "/dev/loop5"
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            ],
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "lv_name": "ceph_lv2",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "lv_size": "21470642176",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "name": "ceph_lv2",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "tags": {
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.cluster_name": "ceph",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.crush_device_class": "",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.encrypted": "0",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.osd_id": "2",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.type": "block",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:                "ceph.vdo": "0"
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            },
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "type": "block",
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:            "vg_name": "ceph_vg2"
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:        }
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]:    ]
Dec  1 04:21:23 np0005540741 wizardly_hawking[132948]: }
Dec  1 04:21:23 np0005540741 systemd[1]: libpod-1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533.scope: Deactivated successfully.
Dec  1 04:21:23 np0005540741 podman[132897]: 2025-12-01 09:21:23.843428269 +0000 UTC m=+1.050049986 container died 1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_hawking, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:21:23 np0005540741 systemd[1]: var-lib-containers-storage-overlay-89a673cc04a6dc9b95e247967adbb3267968a74c3c7ed3b38c6558a83dd2bbb6-merged.mount: Deactivated successfully.
Dec  1 04:21:23 np0005540741 podman[132897]: 2025-12-01 09:21:23.918073834 +0000 UTC m=+1.124695501 container remove 1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_hawking, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 04:21:23 np0005540741 python3.9[133130]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580882.7331283-124-17564990288686/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=d62b0903a77dc1bc7a4454e4946f7491a05b6027 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:23 np0005540741 systemd[1]: libpod-conmon-1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533.scope: Deactivated successfully.
Dec  1 04:21:24 np0005540741 python3.9[133408]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:21:24 np0005540741 podman[133437]: 2025-12-01 09:21:24.607774101 +0000 UTC m=+0.053658303 container create ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_roentgen, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec  1 04:21:24 np0005540741 systemd[1]: Started libpod-conmon-ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a.scope.
Dec  1 04:21:24 np0005540741 podman[133437]: 2025-12-01 09:21:24.58198953 +0000 UTC m=+0.027873762 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:21:24 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:21:24 np0005540741 podman[133437]: 2025-12-01 09:21:24.718625928 +0000 UTC m=+0.164510150 container init ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_roentgen, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True)
Dec  1 04:21:24 np0005540741 podman[133437]: 2025-12-01 09:21:24.727784131 +0000 UTC m=+0.173668313 container start ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_roentgen, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True)
Dec  1 04:21:24 np0005540741 podman[133437]: 2025-12-01 09:21:24.731642172 +0000 UTC m=+0.177526374 container attach ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_roentgen, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:21:24 np0005540741 cranky_roentgen[133474]: 167 167
Dec  1 04:21:24 np0005540741 systemd[1]: libpod-ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a.scope: Deactivated successfully.
Dec  1 04:21:24 np0005540741 podman[133437]: 2025-12-01 09:21:24.736462291 +0000 UTC m=+0.182346483 container died ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_roentgen, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:21:24 np0005540741 systemd[1]: var-lib-containers-storage-overlay-a959c51f468b6cc24e657b8bef059940c507937419c775c1b1abe47e8998657a-merged.mount: Deactivated successfully.
Dec  1 04:21:24 np0005540741 podman[133437]: 2025-12-01 09:21:24.775668438 +0000 UTC m=+0.221552630 container remove ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_roentgen, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 04:21:24 np0005540741 systemd[1]: libpod-conmon-ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a.scope: Deactivated successfully.
Dec  1 04:21:24 np0005540741 podman[133570]: 2025-12-01 09:21:24.965890336 +0000 UTC m=+0.058124312 container create 3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_chatelet, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:21:25 np0005540741 systemd[1]: Started libpod-conmon-3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535.scope.
Dec  1 04:21:25 np0005540741 podman[133570]: 2025-12-01 09:21:24.943331678 +0000 UTC m=+0.035565774 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:21:25 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:21:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e3db95974c039bd2c40ab7e8a4ef0c752ae71d80ffad7028216208056fa09ee/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:21:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e3db95974c039bd2c40ab7e8a4ef0c752ae71d80ffad7028216208056fa09ee/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:21:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e3db95974c039bd2c40ab7e8a4ef0c752ae71d80ffad7028216208056fa09ee/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:21:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e3db95974c039bd2c40ab7e8a4ef0c752ae71d80ffad7028216208056fa09ee/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:21:25 np0005540741 podman[133570]: 2025-12-01 09:21:25.094361429 +0000 UTC m=+0.186595425 container init 3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_chatelet, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  1 04:21:25 np0005540741 podman[133570]: 2025-12-01 09:21:25.102234636 +0000 UTC m=+0.194468612 container start 3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_chatelet, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:21:25 np0005540741 podman[133570]: 2025-12-01 09:21:25.105818539 +0000 UTC m=+0.198052515 container attach 3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_chatelet, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:21:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v284: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:25 np0005540741 python3.9[133614]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580884.133848-124-121422586634669/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=a4bf943053f807dd7cb30711eb355c9616e00332 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:25 np0005540741 python3.9[133771]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]: {
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:        "osd_id": 0,
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:        "type": "bluestore"
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:    },
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:        "osd_id": 1,
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:        "type": "bluestore"
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:    },
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:        "osd_id": 2,
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:        "type": "bluestore"
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]:    }
Dec  1 04:21:26 np0005540741 pedantic_chatelet[133615]: }
Dec  1 04:21:26 np0005540741 systemd[1]: libpod-3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535.scope: Deactivated successfully.
Dec  1 04:21:26 np0005540741 systemd[1]: libpod-3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535.scope: Consumed 1.180s CPU time.
Dec  1 04:21:26 np0005540741 podman[133570]: 2025-12-01 09:21:26.275161075 +0000 UTC m=+1.367395051 container died 3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_chatelet, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:21:26 np0005540741 python3.9[133933]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580885.4008207-124-98226151290056/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=c8d7c531c4af3660f989db6faf3dfcfdf6ae5115 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:26 np0005540741 systemd[1]: var-lib-containers-storage-overlay-4e3db95974c039bd2c40ab7e8a4ef0c752ae71d80ffad7028216208056fa09ee-merged.mount: Deactivated successfully.
Dec  1 04:21:26 np0005540741 podman[133570]: 2025-12-01 09:21:26.629549092 +0000 UTC m=+1.721783068 container remove 3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_chatelet, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 04:21:26 np0005540741 systemd[1]: libpod-conmon-3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535.scope: Deactivated successfully.
Dec  1 04:21:26 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:21:26 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:21:26 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:21:26 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:21:26 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev bd1567b8-0051-4df2-93ec-24ff509f4730 does not exist
Dec  1 04:21:26 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:21:26 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:21:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v285: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:27 np0005540741 python3.9[134136]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:21:28 np0005540741 python3.9[134288]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:21:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:21:29 np0005540741 python3.9[134440]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:21:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v286: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:29 np0005540741 python3.9[134563]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580888.3895655-183-96940074566238/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=00b451d2ac8687242d7356b231cb87a2ffd182d2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:30 np0005540741 python3.9[134715]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:21:31 np0005540741 python3.9[134838]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580889.9793203-183-238090447328546/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=a4bf943053f807dd7cb30711eb355c9616e00332 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v287: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:31 np0005540741 python3.9[134990]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:21:32 np0005540741 python3.9[135113]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580891.3438542-183-212867816627443/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=874ab53010d2695f4bed0a375bf3dde853ac0b72 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v288: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:21:33 np0005540741 python3.9[135265]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:21:34 np0005540741 python3.9[135417]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:21:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v289: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:35 np0005540741 python3.9[135540]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580894.1370416-251-227881741510994/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=30f43b3b193e6fb640de0bf588bd9062982dce0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:36 np0005540741 python3.9[135692]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:21:36 np0005540741 python3.9[135844]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:21:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v290: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:37 np0005540741 python3.9[135967]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580896.3355622-275-225724613973898/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=30f43b3b193e6fb640de0bf588bd9062982dce0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:38 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:21:38 np0005540741 python3.9[136119]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:21:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v291: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:39 np0005540741 python3.9[136271]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:21:40 np0005540741 python3.9[136394]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580898.892895-299-176142090972686/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=30f43b3b193e6fb640de0bf588bd9062982dce0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:40 np0005540741 python3.9[136546]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:21:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v292: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:41 np0005540741 python3.9[136698]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:21:42 np0005540741 python3.9[136821]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580901.044957-323-263294009601000/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=30f43b3b193e6fb640de0bf588bd9062982dce0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:21:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:21:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:21:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:21:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:21:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:21:43 np0005540741 python3.9[136973]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:21:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v293: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:43 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:21:43 np0005540741 python3.9[137125]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:21:44 np0005540741 python3.9[137248]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580903.3048897-347-153379687237424/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=30f43b3b193e6fb640de0bf588bd9062982dce0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v294: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:45 np0005540741 python3.9[137400]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:21:46 np0005540741 python3.9[137552]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:21:46 np0005540741 python3.9[137675]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580905.4577227-371-173330415019643/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=30f43b3b193e6fb640de0bf588bd9062982dce0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:47 np0005540741 systemd[1]: session-44.scope: Deactivated successfully.
Dec  1 04:21:47 np0005540741 systemd[1]: session-44.scope: Consumed 27.010s CPU time.
Dec  1 04:21:47 np0005540741 systemd-logind[788]: Session 44 logged out. Waiting for processes to exit.
Dec  1 04:21:47 np0005540741 systemd-logind[788]: Removed session 44.
Dec  1 04:21:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v295: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:21:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v296: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v297: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:52 np0005540741 systemd-logind[788]: New session 45 of user zuul.
Dec  1 04:21:52 np0005540741 systemd[1]: Started Session 45 of User zuul.
Dec  1 04:21:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v298: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:53 np0005540741 python3.9[137855]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:21:54 np0005540741 python3.9[138007]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:21:54 np0005540741 python3.9[138130]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580913.446532-34-232238526912525/.source.conf _original_basename=ceph.conf follow=False checksum=2bbdee6ce99be2e18e11631e7462d3c1fd9af211 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v299: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:55 np0005540741 python3.9[138282]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:21:56 np0005540741 python3.9[138405]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580915.0227323-34-24352050980876/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=30c595aa84bea916cfc9cc906a8788f27659122a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:56 np0005540741 systemd[1]: session-45.scope: Deactivated successfully.
Dec  1 04:21:56 np0005540741 systemd[1]: session-45.scope: Consumed 2.859s CPU time.
Dec  1 04:21:56 np0005540741 systemd-logind[788]: Session 45 logged out. Waiting for processes to exit.
Dec  1 04:21:56 np0005540741 systemd-logind[788]: Removed session 45.
Dec  1 04:21:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v300: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:21:58 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:21:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v301: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v302: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v303: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:03 np0005540741 systemd-logind[788]: New session 46 of user zuul.
Dec  1 04:22:03 np0005540741 systemd[1]: Started Session 46 of User zuul.
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #21. Immutable memtables: 0.
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.530142) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 21
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580923530378, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 768, "num_deletes": 250, "total_data_size": 675094, "memory_usage": 689840, "flush_reason": "Manual Compaction"}
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #22: started
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580923541330, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 22, "file_size": 434410, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6690, "largest_seqno": 7457, "table_properties": {"data_size": 431164, "index_size": 1090, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8215, "raw_average_key_size": 19, "raw_value_size": 424355, "raw_average_value_size": 1005, "num_data_blocks": 51, "num_entries": 422, "num_filter_entries": 422, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580855, "oldest_key_time": 1764580855, "file_creation_time": 1764580923, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 22, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 11148 microseconds, and 3741 cpu microseconds.
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.541390) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #22: 434410 bytes OK
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.541414) [db/memtable_list.cc:519] [default] Level-0 commit table #22 started
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.542960) [db/memtable_list.cc:722] [default] Level-0 commit table #22: memtable #1 done
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.542977) EVENT_LOG_v1 {"time_micros": 1764580923542970, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.543000) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 671220, prev total WAL file size 671220, number of live WAL files 2.
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000018.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.543572) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323531' seq:0, type:0; will stop at (end)
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [22(424KB)], [20(5217KB)]
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580923543656, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [22], "files_L6": [20], "score": -1, "input_data_size": 5777614, "oldest_snapshot_seqno": -1}
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #23: 2571 keys, 4204786 bytes, temperature: kUnknown
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580923582006, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 23, "file_size": 4204786, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 4186103, "index_size": 11150, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 6469, "raw_key_size": 59820, "raw_average_key_size": 23, "raw_value_size": 4138447, "raw_average_value_size": 1609, "num_data_blocks": 507, "num_entries": 2571, "num_filter_entries": 2571, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764580923, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.582395) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 4204786 bytes
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.584282) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 150.2 rd, 109.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 5.1 +0.0 blob) out(4.0 +0.0 blob), read-write-amplify(23.0) write-amplify(9.7) OK, records in: 3055, records dropped: 484 output_compression: NoCompression
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.584337) EVENT_LOG_v1 {"time_micros": 1764580923584320, "job": 6, "event": "compaction_finished", "compaction_time_micros": 38463, "compaction_time_cpu_micros": 18178, "output_level": 6, "num_output_files": 1, "total_output_size": 4204786, "num_input_records": 3055, "num_output_records": 2571, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000022.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580923584589, "job": 6, "event": "table_file_deletion", "file_number": 22}
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580923585942, "job": 6, "event": "table_file_deletion", "file_number": 20}
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.543451) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.586127) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.586137) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.586140) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.586143) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:22:03 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.586147) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:22:04 np0005540741 python3.9[138583]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:22:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v304: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:05 np0005540741 python3.9[138739]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:22:06 np0005540741 python3.9[138891]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:22:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v305: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:07 np0005540741 python3.9[139041]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:22:08 np0005540741 python3.9[139193]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec  1 04:22:08 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:22:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v306: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:10 np0005540741 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec  1 04:22:10 np0005540741 python3.9[139349]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:22:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v307: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:11 np0005540741 python3.9[139433]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:22:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:22:12
Dec  1 04:22:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:22:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:22:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['backups', '.mgr', 'images', 'cephfs.cephfs.data', 'vms', 'cephfs.cephfs.meta', 'volumes']
Dec  1 04:22:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:22:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:22:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:22:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:22:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:22:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:22:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:22:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:22:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:22:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:22:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:22:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:22:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:22:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:22:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:22:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:22:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:22:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v308: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:22:13 np0005540741 python3.9[139586]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 04:22:15 np0005540741 python3[139741]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
  rule:
    proto: udp
    dport: 4789
- rule_name: 119 neutron geneve networks
  rule:
    proto: udp
    dport: 6081
    state: ["UNTRACKED"]
- rule_name: 120 neutron geneve networks no conntrack
  rule:
    proto: udp
    dport: 6081
    table: raw
    chain: OUTPUT
    jump: NOTRACK
    action: append
    state: []
- rule_name: 121 neutron geneve networks no conntrack
  rule:
    proto: udp
    dport: 6081
    table: raw
    chain: PREROUTING
    jump: NOTRACK
    action: append
    state: []
 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec  1 04:22:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v309: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:16 np0005540741 python3.9[139893]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:16 np0005540741 python3.9[140045]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:22:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v310: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:17 np0005540741 python3.9[140123]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:18 np0005540741 python3.9[140275]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:22:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:22:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:22:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:22:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:22:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:22:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:22:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:22:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:22:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:22:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:22:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:22:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:22:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:22:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:22:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:22:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:22:18 np0005540741 python3.9[140353]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.r7zapobs recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v311: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:19 np0005540741 python3.9[140505]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:22:20 np0005540741 python3.9[140583]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:21 np0005540741 python3.9[140735]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:22:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v312: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:22 np0005540741 python3[140888]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  1 04:22:22 np0005540741 python3.9[141040]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:22:22 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:22:22 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 1726 writes, 7468 keys, 1726 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.01 MB/s#012Cumulative WAL: 1726 writes, 1726 syncs, 1.00 writes per sync, written: 0.01 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1726 writes, 7468 keys, 1726 commit groups, 1.0 writes per commit group, ingest: 7.48 MB, 0.01 MB/s#012Interval WAL: 1726 writes, 1726 syncs, 1.00 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    105.0      0.05              0.02         3    0.018       0      0       0.0       0.0#012  L6      1/0    4.01 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.6    119.7    102.4      0.09              0.05         2    0.044    5977    773       0.0       0.0#012 Sum      1/0    4.01 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.6     75.1    103.4      0.14              0.07         5    0.028    5977    773       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.7     76.5    104.9      0.14              0.07         4    0.035    5977    773       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0    119.7    102.4      0.09              0.05         2    0.044    5977    773       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    109.3      0.05              0.02         2    0.025       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     19.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.005, interval 0.005#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.01 GB write, 0.02 MB/s write, 0.01 GB read, 0.02 MB/s read, 0.1 seconds#012Interval compaction: 0.01 GB write, 0.02 MB/s write, 0.01 GB read, 0.02 MB/s read, 0.1 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bbd56b51f0#2 capacity: 308.00 MB usage: 560.92 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 5.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(36,492.47 KB,0.156145%) FilterBlock(6,22.67 KB,0.00718847%) IndexBlock(6,45.78 KB,0.0145157%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Dec  1 04:22:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v313: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:22:23 np0005540741 python3.9[141165]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580942.225214-157-231396235722568/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:24 np0005540741 python3.9[141317]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:22:24 np0005540741 python3.9[141442]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580943.7514427-172-165017964126906/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v314: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:25 np0005540741 python3.9[141594]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:22:26 np0005540741 python3.9[141719]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580945.1733842-187-192928384059708/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:27 np0005540741 python3.9[141871]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:22:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v315: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:22:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:22:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:22:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:22:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:22:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:22:27 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 8821ad64-a99e-42f4-82c4-2c1991d750fe does not exist
Dec  1 04:22:27 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 4331d7c3-f957-4659-b3c6-1932c1608ea8 does not exist
Dec  1 04:22:27 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 2fbe7e33-cf3d-46b4-9c8e-efa2c206b754 does not exist
Dec  1 04:22:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:22:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:22:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:22:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:22:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:22:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:22:27 np0005540741 python3.9[142113]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580946.4221358-202-42183785616875/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:28 np0005540741 podman[142381]: 2025-12-01 09:22:28.31688917 +0000 UTC m=+0.049867447 container create c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_lovelace, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:22:28 np0005540741 systemd[1]: Started libpod-conmon-c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31.scope.
Dec  1 04:22:28 np0005540741 podman[142381]: 2025-12-01 09:22:28.29611235 +0000 UTC m=+0.029090657 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:22:28 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:22:28 np0005540741 podman[142381]: 2025-12-01 09:22:28.423041965 +0000 UTC m=+0.156020252 container init c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_lovelace, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True)
Dec  1 04:22:28 np0005540741 podman[142381]: 2025-12-01 09:22:28.433586555 +0000 UTC m=+0.166564852 container start c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_lovelace, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:22:28 np0005540741 podman[142381]: 2025-12-01 09:22:28.437530867 +0000 UTC m=+0.170509164 container attach c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_lovelace, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec  1 04:22:28 np0005540741 laughing_lovelace[142432]: 167 167
Dec  1 04:22:28 np0005540741 systemd[1]: libpod-c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31.scope: Deactivated successfully.
Dec  1 04:22:28 np0005540741 podman[142381]: 2025-12-01 09:22:28.444554906 +0000 UTC m=+0.177533183 container died c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_lovelace, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 04:22:28 np0005540741 systemd[1]: var-lib-containers-storage-overlay-9a3a712a3db0981c066aef77fef4972d490272ac4d65e20de91f76144bdcd027-merged.mount: Deactivated successfully.
Dec  1 04:22:28 np0005540741 podman[142381]: 2025-12-01 09:22:28.499173478 +0000 UTC m=+0.232151745 container remove c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_lovelace, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Dec  1 04:22:28 np0005540741 systemd[1]: libpod-conmon-c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31.scope: Deactivated successfully.
Dec  1 04:22:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:22:28 np0005540741 python3.9[142434]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:22:28 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:22:28 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:22:28 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:22:28 np0005540741 podman[142459]: 2025-12-01 09:22:28.673455617 +0000 UTC m=+0.044112873 container create 6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kilby, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:22:28 np0005540741 systemd[1]: Started libpod-conmon-6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19.scope.
Dec  1 04:22:28 np0005540741 podman[142459]: 2025-12-01 09:22:28.651464233 +0000 UTC m=+0.022121489 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:22:28 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:22:28 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02e9a3fda8131a5b1317ed5055fb208c60e7b084e6d674b873db6747dd81020f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:22:28 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02e9a3fda8131a5b1317ed5055fb208c60e7b084e6d674b873db6747dd81020f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:22:28 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02e9a3fda8131a5b1317ed5055fb208c60e7b084e6d674b873db6747dd81020f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:22:28 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02e9a3fda8131a5b1317ed5055fb208c60e7b084e6d674b873db6747dd81020f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:22:28 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02e9a3fda8131a5b1317ed5055fb208c60e7b084e6d674b873db6747dd81020f/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:22:28 np0005540741 podman[142459]: 2025-12-01 09:22:28.785188491 +0000 UTC m=+0.155845767 container init 6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kilby, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec  1 04:22:28 np0005540741 podman[142459]: 2025-12-01 09:22:28.798701995 +0000 UTC m=+0.169359231 container start 6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kilby, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  1 04:22:28 np0005540741 podman[142459]: 2025-12-01 09:22:28.802154723 +0000 UTC m=+0.172811979 container attach 6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kilby, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec  1 04:22:29 np0005540741 python3.9[142603]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580947.9659944-217-195359211251029/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v316: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:29 np0005540741 affectionate_kilby[142506]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:22:29 np0005540741 affectionate_kilby[142506]: --> relative data size: 1.0
Dec  1 04:22:29 np0005540741 affectionate_kilby[142506]: --> All data devices are unavailable
Dec  1 04:22:29 np0005540741 python3.9[142769]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:29 np0005540741 systemd[1]: libpod-6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19.scope: Deactivated successfully.
Dec  1 04:22:29 np0005540741 systemd[1]: libpod-6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19.scope: Consumed 1.034s CPU time.
Dec  1 04:22:29 np0005540741 podman[142459]: 2025-12-01 09:22:29.893317553 +0000 UTC m=+1.263974779 container died 6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kilby, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  1 04:22:29 np0005540741 systemd[1]: var-lib-containers-storage-overlay-02e9a3fda8131a5b1317ed5055fb208c60e7b084e6d674b873db6747dd81020f-merged.mount: Deactivated successfully.
Dec  1 04:22:29 np0005540741 podman[142459]: 2025-12-01 09:22:29.952217026 +0000 UTC m=+1.322874252 container remove 6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kilby, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Dec  1 04:22:29 np0005540741 systemd[1]: libpod-conmon-6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19.scope: Deactivated successfully.
Dec  1 04:22:30 np0005540741 podman[143083]: 2025-12-01 09:22:30.679323427 +0000 UTC m=+0.064914485 container create 661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_moore, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:22:30 np0005540741 systemd[1]: Started libpod-conmon-661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509.scope.
Dec  1 04:22:30 np0005540741 python3.9[143080]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:22:30 np0005540741 podman[143083]: 2025-12-01 09:22:30.658274619 +0000 UTC m=+0.043865737 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:22:30 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:22:30 np0005540741 podman[143083]: 2025-12-01 09:22:30.769438666 +0000 UTC m=+0.155029754 container init 661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_moore, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:22:30 np0005540741 podman[143083]: 2025-12-01 09:22:30.779729068 +0000 UTC m=+0.165320136 container start 661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_moore, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:22:30 np0005540741 podman[143083]: 2025-12-01 09:22:30.783752733 +0000 UTC m=+0.169343821 container attach 661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_moore, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef)
Dec  1 04:22:30 np0005540741 elegant_moore[143100]: 167 167
Dec  1 04:22:30 np0005540741 systemd[1]: libpod-661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509.scope: Deactivated successfully.
Dec  1 04:22:30 np0005540741 podman[143083]: 2025-12-01 09:22:30.787435207 +0000 UTC m=+0.173026295 container died 661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_moore, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:22:30 np0005540741 systemd[1]: var-lib-containers-storage-overlay-4fbed17bb650c4d9d10a134138c56c7a4acadfc9db762c6829ca47d53daeec1f-merged.mount: Deactivated successfully.
Dec  1 04:22:30 np0005540741 podman[143083]: 2025-12-01 09:22:30.82662144 +0000 UTC m=+0.212212518 container remove 661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_moore, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec  1 04:22:30 np0005540741 systemd[1]: libpod-conmon-661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509.scope: Deactivated successfully.
Dec  1 04:22:31 np0005540741 podman[143151]: 2025-12-01 09:22:31.004329927 +0000 UTC m=+0.066422707 container create e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_dubinsky, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec  1 04:22:31 np0005540741 podman[143151]: 2025-12-01 09:22:30.971528896 +0000 UTC m=+0.033621726 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:22:31 np0005540741 systemd[1]: Started libpod-conmon-e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd.scope.
Dec  1 04:22:31 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:22:31 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/564f25c2f1fc0267c9782b035d09de309bccbae5aa761b887e549893ddca620a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:22:31 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/564f25c2f1fc0267c9782b035d09de309bccbae5aa761b887e549893ddca620a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:22:31 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/564f25c2f1fc0267c9782b035d09de309bccbae5aa761b887e549893ddca620a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:22:31 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/564f25c2f1fc0267c9782b035d09de309bccbae5aa761b887e549893ddca620a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:22:31 np0005540741 podman[143151]: 2025-12-01 09:22:31.121264939 +0000 UTC m=+0.183357739 container init e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_dubinsky, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:22:31 np0005540741 podman[143151]: 2025-12-01 09:22:31.133386253 +0000 UTC m=+0.195479033 container start e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_dubinsky, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:22:31 np0005540741 podman[143151]: 2025-12-01 09:22:31.138349234 +0000 UTC m=+0.200442014 container attach e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_dubinsky, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:22:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v317: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:31 np0005540741 python3.9[143300]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]: {
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:    "0": [
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:        {
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "devices": [
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "/dev/loop3"
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            ],
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "lv_name": "ceph_lv0",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "lv_size": "21470642176",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "name": "ceph_lv0",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "tags": {
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.cluster_name": "ceph",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.crush_device_class": "",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.encrypted": "0",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.osd_id": "0",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.type": "block",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.vdo": "0"
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            },
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "type": "block",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "vg_name": "ceph_vg0"
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:        }
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:    ],
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:    "1": [
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:        {
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "devices": [
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "/dev/loop4"
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            ],
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "lv_name": "ceph_lv1",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "lv_size": "21470642176",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "name": "ceph_lv1",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "tags": {
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.cluster_name": "ceph",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.crush_device_class": "",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.encrypted": "0",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.osd_id": "1",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.type": "block",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.vdo": "0"
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            },
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "type": "block",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "vg_name": "ceph_vg1"
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:        }
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:    ],
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:    "2": [
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:        {
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "devices": [
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "/dev/loop5"
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            ],
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "lv_name": "ceph_lv2",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "lv_size": "21470642176",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "name": "ceph_lv2",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "tags": {
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.cluster_name": "ceph",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.crush_device_class": "",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.encrypted": "0",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.osd_id": "2",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.type": "block",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:                "ceph.vdo": "0"
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            },
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "type": "block",
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:            "vg_name": "ceph_vg2"
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:        }
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]:    ]
Dec  1 04:22:31 np0005540741 lucid_dubinsky[143217]: }
Dec  1 04:22:32 np0005540741 systemd[1]: libpod-e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd.scope: Deactivated successfully.
Dec  1 04:22:32 np0005540741 podman[143151]: 2025-12-01 09:22:32.017223004 +0000 UTC m=+1.079315784 container died e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_dubinsky, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec  1 04:22:32 np0005540741 systemd[1]: var-lib-containers-storage-overlay-564f25c2f1fc0267c9782b035d09de309bccbae5aa761b887e549893ddca620a-merged.mount: Deactivated successfully.
Dec  1 04:22:32 np0005540741 podman[143151]: 2025-12-01 09:22:32.080563843 +0000 UTC m=+1.142656593 container remove e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_dubinsky, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec  1 04:22:32 np0005540741 systemd[1]: libpod-conmon-e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd.scope: Deactivated successfully.
Dec  1 04:22:32 np0005540741 python3.9[143519]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:22:32 np0005540741 podman[143689]: 2025-12-01 09:22:32.822546376 +0000 UTC m=+0.047363936 container create 352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_benz, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Dec  1 04:22:32 np0005540741 systemd[1]: Started libpod-conmon-352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0.scope.
Dec  1 04:22:32 np0005540741 podman[143689]: 2025-12-01 09:22:32.799886823 +0000 UTC m=+0.024704423 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:22:32 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:22:32 np0005540741 podman[143689]: 2025-12-01 09:22:32.919372816 +0000 UTC m=+0.144190396 container init 352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_benz, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:22:32 np0005540741 podman[143689]: 2025-12-01 09:22:32.933148697 +0000 UTC m=+0.157966257 container start 352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_benz, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:22:32 np0005540741 podman[143689]: 2025-12-01 09:22:32.938579812 +0000 UTC m=+0.163397382 container attach 352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_benz, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:22:32 np0005540741 sweet_benz[143732]: 167 167
Dec  1 04:22:32 np0005540741 systemd[1]: libpod-352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0.scope: Deactivated successfully.
Dec  1 04:22:32 np0005540741 podman[143689]: 2025-12-01 09:22:32.940816105 +0000 UTC m=+0.165633685 container died 352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_benz, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:22:32 np0005540741 systemd[1]: var-lib-containers-storage-overlay-772545479a02b333d6b8575f6da5bd9187e04118edffeba56fe572eac2b9a9da-merged.mount: Deactivated successfully.
Dec  1 04:22:32 np0005540741 podman[143689]: 2025-12-01 09:22:32.983988471 +0000 UTC m=+0.208806021 container remove 352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_benz, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:22:32 np0005540741 systemd[1]: libpod-conmon-352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0.scope: Deactivated successfully.
Dec  1 04:22:33 np0005540741 podman[143803]: 2025-12-01 09:22:33.168284546 +0000 UTC m=+0.048028656 container create dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_gates, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 04:22:33 np0005540741 python3.9[143797]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:22:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v318: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:33 np0005540741 systemd[1]: Started libpod-conmon-dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4.scope.
Dec  1 04:22:33 np0005540741 podman[143803]: 2025-12-01 09:22:33.149768 +0000 UTC m=+0.029512130 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:22:33 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:22:33 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46b9484028f687832b784273f9e00efbe969116e8d436843163a6ea681366768/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:22:33 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46b9484028f687832b784273f9e00efbe969116e8d436843163a6ea681366768/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:22:33 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46b9484028f687832b784273f9e00efbe969116e8d436843163a6ea681366768/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:22:33 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46b9484028f687832b784273f9e00efbe969116e8d436843163a6ea681366768/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:22:33 np0005540741 podman[143803]: 2025-12-01 09:22:33.310144485 +0000 UTC m=+0.189888655 container init dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_gates, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:22:33 np0005540741 podman[143803]: 2025-12-01 09:22:33.317179714 +0000 UTC m=+0.196923834 container start dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_gates, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Dec  1 04:22:33 np0005540741 podman[143803]: 2025-12-01 09:22:33.320454887 +0000 UTC m=+0.200199037 container attach dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_gates, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:22:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:22:33 np0005540741 python3.9[143977]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]: {
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:        "osd_id": 0,
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:        "type": "bluestore"
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:    },
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:        "osd_id": 1,
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:        "type": "bluestore"
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:    },
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:        "osd_id": 2,
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:        "type": "bluestore"
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]:    }
Dec  1 04:22:34 np0005540741 wonderful_gates[143821]: }
Dec  1 04:22:34 np0005540741 systemd[1]: libpod-dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4.scope: Deactivated successfully.
Dec  1 04:22:34 np0005540741 systemd[1]: libpod-dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4.scope: Consumed 1.019s CPU time.
Dec  1 04:22:34 np0005540741 podman[143803]: 2025-12-01 09:22:34.334825327 +0000 UTC m=+1.214569517 container died dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_gates, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec  1 04:22:34 np0005540741 systemd[1]: var-lib-containers-storage-overlay-46b9484028f687832b784273f9e00efbe969116e8d436843163a6ea681366768-merged.mount: Deactivated successfully.
Dec  1 04:22:34 np0005540741 podman[143803]: 2025-12-01 09:22:34.402277063 +0000 UTC m=+1.282021173 container remove dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_gates, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Dec  1 04:22:34 np0005540741 systemd[1]: libpod-conmon-dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4.scope: Deactivated successfully.
Dec  1 04:22:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:22:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:22:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:22:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:22:34 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev fa9d40c6-08b8-4b3e-933d-60633a99e5ab does not exist
Dec  1 04:22:34 np0005540741 python3.9[144174]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:34 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:22:34 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:22:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v319: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:35 np0005540741 python3.9[144374]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:22:37 np0005540741 python3.9[144527]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:c6:22:5a:f7" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:22:37 np0005540741 ovs-vsctl[144528]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:c6:22:5a:f7 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec  1 04:22:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v320: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:37 np0005540741 python3.9[144680]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:22:38 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:22:38 np0005540741 python3.9[144835]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:22:38 np0005540741 ovs-vsctl[144836]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec  1 04:22:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v321: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:39 np0005540741 python3.9[144986]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:22:40 np0005540741 python3.9[145140]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:22:41 np0005540741 python3.9[145292]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:22:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v322: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:41 np0005540741 python3.9[145370]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:22:42 np0005540741 python3.9[145522]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:22:42 np0005540741 python3.9[145600]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:22:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:22:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:22:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:22:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:22:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:22:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:22:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v323: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:43 np0005540741 python3.9[145752]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:43 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:22:44 np0005540741 python3.9[145904]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:22:44 np0005540741 python3.9[145982]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v324: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:45 np0005540741 python3.9[146134]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:22:45 np0005540741 python3.9[146212]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:46 np0005540741 python3.9[146364]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:22:46 np0005540741 systemd[1]: Reloading.
Dec  1 04:22:46 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:22:46 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:22:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v325: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:47 np0005540741 python3.9[146553]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:22:48 np0005540741 python3.9[146631]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:22:48 np0005540741 python3.9[146783]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:22:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v326: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:49 np0005540741 python3.9[146861]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:50 np0005540741 python3.9[147013]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:22:50 np0005540741 systemd[1]: Reloading.
Dec  1 04:22:50 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:22:50 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:22:50 np0005540741 systemd[1]: Starting Create netns directory...
Dec  1 04:22:50 np0005540741 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  1 04:22:50 np0005540741 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  1 04:22:50 np0005540741 systemd[1]: Finished Create netns directory.
Dec  1 04:22:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v327: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:51 np0005540741 python3.9[147206]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:22:52 np0005540741 python3.9[147358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:22:52 np0005540741 python3.9[147481]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764580971.641556-468-223178151653433/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:22:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v328: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:22:53 np0005540741 python3.9[147633]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:22:54 np0005540741 python3.9[147785]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:22:54 np0005540741 python3.9[147908]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580973.8409085-493-19042733144268/.source.json _original_basename=.debfnmn8 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v329: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:55 np0005540741 python3.9[148060]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:22:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v330: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:57 np0005540741 python3.9[148487]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec  1 04:22:58 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:22:58 np0005540741 python3.9[148639]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  1 04:22:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v331: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:22:59 np0005540741 python3.9[148791]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  1 04:23:01 np0005540741 python3[148970]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  1 04:23:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v332: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v333: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:23:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v334: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:07 np0005540741 podman[148982]: 2025-12-01 09:23:07.003057856 +0000 UTC m=+5.876582464 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  1 04:23:07 np0005540741 podman[149099]: 2025-12-01 09:23:07.207889686 +0000 UTC m=+0.075439288 container create 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:23:07 np0005540741 podman[149099]: 2025-12-01 09:23:07.174451661 +0000 UTC m=+0.042001273 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  1 04:23:07 np0005540741 python3[148970]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  1 04:23:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v335: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:08 np0005540741 python3.9[149288]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:23:08 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:23:08 np0005540741 python3.9[149442]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:23:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v336: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:09 np0005540741 python3.9[149518]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:23:10 np0005540741 python3.9[149669]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580989.3717155-581-123960181093906/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:23:10 np0005540741 python3.9[149745]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 04:23:10 np0005540741 systemd[1]: Reloading.
Dec  1 04:23:10 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:23:10 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:23:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v337: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:11 np0005540741 python3.9[149857]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:23:11 np0005540741 systemd[1]: Reloading.
Dec  1 04:23:11 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:23:11 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:23:12 np0005540741 systemd[1]: Starting ovn_controller container...
Dec  1 04:23:12 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:23:12 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b77894e7671e5f816c1686647ee4a2e892983fd7b53971baf083a85c34a46778/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec  1 04:23:12 np0005540741 systemd[1]: Started /usr/bin/podman healthcheck run 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c.
Dec  1 04:23:12 np0005540741 podman[149898]: 2025-12-01 09:23:12.208668003 +0000 UTC m=+0.131907578 container init 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: + sudo -E kolla_set_configs
Dec  1 04:23:12 np0005540741 podman[149898]: 2025-12-01 09:23:12.232863692 +0000 UTC m=+0.156103247 container start 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 04:23:12 np0005540741 edpm-start-podman-container[149898]: ovn_controller
Dec  1 04:23:12 np0005540741 systemd[1]: Created slice User Slice of UID 0.
Dec  1 04:23:12 np0005540741 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec  1 04:23:12 np0005540741 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec  1 04:23:12 np0005540741 edpm-start-podman-container[149897]: Creating additional drop-in dependency for "ovn_controller" (34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c)
Dec  1 04:23:12 np0005540741 systemd[1]: Starting User Manager for UID 0...
Dec  1 04:23:12 np0005540741 podman[149921]: 2025-12-01 09:23:12.312076528 +0000 UTC m=+0.066013856 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec  1 04:23:12 np0005540741 systemd[1]: 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c-968c33bf7dddd62.service: Main process exited, code=exited, status=1/FAILURE
Dec  1 04:23:12 np0005540741 systemd[1]: 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c-968c33bf7dddd62.service: Failed with result 'exit-code'.
Dec  1 04:23:12 np0005540741 systemd[1]: Reloading.
Dec  1 04:23:12 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:23:12 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:23:12 np0005540741 systemd[149955]: Queued start job for default target Main User Target.
Dec  1 04:23:12 np0005540741 systemd[149955]: Created slice User Application Slice.
Dec  1 04:23:12 np0005540741 systemd[149955]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec  1 04:23:12 np0005540741 systemd[149955]: Started Daily Cleanup of User's Temporary Directories.
Dec  1 04:23:12 np0005540741 systemd[149955]: Reached target Paths.
Dec  1 04:23:12 np0005540741 systemd[149955]: Reached target Timers.
Dec  1 04:23:12 np0005540741 systemd[149955]: Starting D-Bus User Message Bus Socket...
Dec  1 04:23:12 np0005540741 systemd[149955]: Starting Create User's Volatile Files and Directories...
Dec  1 04:23:12 np0005540741 systemd[149955]: Listening on D-Bus User Message Bus Socket.
Dec  1 04:23:12 np0005540741 systemd[149955]: Reached target Sockets.
Dec  1 04:23:12 np0005540741 systemd[149955]: Finished Create User's Volatile Files and Directories.
Dec  1 04:23:12 np0005540741 systemd[149955]: Reached target Basic System.
Dec  1 04:23:12 np0005540741 systemd[149955]: Reached target Main User Target.
Dec  1 04:23:12 np0005540741 systemd[149955]: Startup finished in 155ms.
Dec  1 04:23:12 np0005540741 systemd[1]: Started User Manager for UID 0.
Dec  1 04:23:12 np0005540741 systemd[1]: Started Session c1 of User root.
Dec  1 04:23:12 np0005540741 systemd[1]: Started ovn_controller container.
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: INFO:__main__:Validating config file
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: INFO:__main__:Writing out command to execute
Dec  1 04:23:12 np0005540741 systemd[1]: session-c1.scope: Deactivated successfully.
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: ++ cat /run_command
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: + ARGS=
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: + sudo kolla_copy_cacerts
Dec  1 04:23:12 np0005540741 systemd[1]: Started Session c2 of User root.
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: + [[ ! -n '' ]]
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: + . kolla_extend_start
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: + umask 0022
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec  1 04:23:12 np0005540741 systemd[1]: session-c2.scope: Deactivated successfully.
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec  1 04:23:12 np0005540741 NetworkManager[48954]: <info>  [1764580992.8059] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Dec  1 04:23:12 np0005540741 NetworkManager[48954]: <info>  [1764580992.8066] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:23:12 np0005540741 NetworkManager[48954]: <info>  [1764580992.8076] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec  1 04:23:12 np0005540741 NetworkManager[48954]: <info>  [1764580992.8080] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Dec  1 04:23:12 np0005540741 NetworkManager[48954]: <info>  [1764580992.8083] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec  1 04:23:12 np0005540741 kernel: br-int: entered promiscuous mode
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00014|main|INFO|OVS feature set changed, force recompute.
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00022|main|INFO|OVS feature set changed, force recompute.
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00001|statctrl(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00002|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00003|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  1 04:23:12 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:12Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  1 04:23:12 np0005540741 NetworkManager[48954]: <info>  [1764580992.8281] manager: (ovn-7cec17-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec  1 04:23:12 np0005540741 kernel: genev_sys_6081: entered promiscuous mode
Dec  1 04:23:12 np0005540741 NetworkManager[48954]: <info>  [1764580992.8487] device (genev_sys_6081): carrier: link connected
Dec  1 04:23:12 np0005540741 NetworkManager[48954]: <info>  [1764580992.8490] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Dec  1 04:23:12 np0005540741 systemd-udevd[150076]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:23:12 np0005540741 systemd-udevd[150072]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:23:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:23:12
Dec  1 04:23:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:23:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:23:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['volumes', 'cephfs.cephfs.data', 'vms', 'images', 'backups', 'cephfs.cephfs.meta', '.mgr']
Dec  1 04:23:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:23:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:23:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:23:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:23:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:23:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:23:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:23:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:23:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:23:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:23:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:23:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:23:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:23:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:23:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:23:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:23:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:23:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v338: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:13 np0005540741 python3.9[150183]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:23:13 np0005540741 ovs-vsctl[150184]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec  1 04:23:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:23:13 np0005540741 python3.9[150336]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:23:13 np0005540741 ovs-vsctl[150338]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec  1 04:23:14 np0005540741 python3.9[150491]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:23:14 np0005540741 ovs-vsctl[150492]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec  1 04:23:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v339: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:15 np0005540741 systemd[1]: session-46.scope: Deactivated successfully.
Dec  1 04:23:15 np0005540741 systemd[1]: session-46.scope: Consumed 1min 3.147s CPU time.
Dec  1 04:23:15 np0005540741 systemd-logind[788]: Session 46 logged out. Waiting for processes to exit.
Dec  1 04:23:15 np0005540741 systemd-logind[788]: Removed session 46.
Dec  1 04:23:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v340: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:23:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:23:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:23:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:23:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:23:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:23:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:23:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:23:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:23:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:23:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:23:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:23:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:23:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:23:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:23:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:23:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v341: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v342: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:21 np0005540741 systemd-logind[788]: New session 48 of user zuul.
Dec  1 04:23:21 np0005540741 systemd[1]: Started Session 48 of User zuul.
Dec  1 04:23:22 np0005540741 python3.9[150670]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:23:22 np0005540741 systemd[1]: Stopping User Manager for UID 0...
Dec  1 04:23:22 np0005540741 systemd[149955]: Activating special unit Exit the Session...
Dec  1 04:23:22 np0005540741 systemd[149955]: Stopped target Main User Target.
Dec  1 04:23:22 np0005540741 systemd[149955]: Stopped target Basic System.
Dec  1 04:23:22 np0005540741 systemd[149955]: Stopped target Paths.
Dec  1 04:23:22 np0005540741 systemd[149955]: Stopped target Sockets.
Dec  1 04:23:22 np0005540741 systemd[149955]: Stopped target Timers.
Dec  1 04:23:22 np0005540741 systemd[149955]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  1 04:23:22 np0005540741 systemd[149955]: Closed D-Bus User Message Bus Socket.
Dec  1 04:23:22 np0005540741 systemd[149955]: Stopped Create User's Volatile Files and Directories.
Dec  1 04:23:22 np0005540741 systemd[149955]: Removed slice User Application Slice.
Dec  1 04:23:22 np0005540741 systemd[149955]: Reached target Shutdown.
Dec  1 04:23:22 np0005540741 systemd[149955]: Finished Exit the Session.
Dec  1 04:23:22 np0005540741 systemd[149955]: Reached target Exit the Session.
Dec  1 04:23:22 np0005540741 systemd[1]: user@0.service: Deactivated successfully.
Dec  1 04:23:22 np0005540741 systemd[1]: Stopped User Manager for UID 0.
Dec  1 04:23:22 np0005540741 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec  1 04:23:22 np0005540741 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec  1 04:23:22 np0005540741 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec  1 04:23:22 np0005540741 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec  1 04:23:22 np0005540741 systemd[1]: Removed slice User Slice of UID 0.
Dec  1 04:23:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v343: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:23:23 np0005540741 python3.9[150829]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:23:24 np0005540741 python3.9[150981]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:23:24 np0005540741 python3.9[151133]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:23:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v344: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:25 np0005540741 python3.9[151285]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:23:26 np0005540741 python3.9[151437]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:23:26 np0005540741 python3.9[151587]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:23:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v345: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:27 np0005540741 python3.9[151739]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec  1 04:23:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:23:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v346: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:29 np0005540741 python3.9[151889]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:23:30 np0005540741 python3.9[152010]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581008.7121596-86-104930840889940/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:23:30 np0005540741 python3.9[152160]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:23:31 np0005540741 python3.9[152283]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581010.2768936-101-206062927876463/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:23:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v347: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:32 np0005540741 python3.9[152435]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:23:33 np0005540741 python3.9[152519]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:23:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v348: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:23:35 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:23:35 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:23:35 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:23:35 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:23:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v349: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:35 np0005540741 python3.9[152894]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 04:23:36 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:23:36 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:23:36 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:23:36 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:23:36 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:23:36 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:23:36 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 40d06d98-796c-49fc-9fd2-27cd21485fc6 does not exist
Dec  1 04:23:36 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev f75698da-039f-4aab-baf1-77d84153a9e3 does not exist
Dec  1 04:23:36 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev beaa4b04-d67c-4497-addc-7d613d264aaa does not exist
Dec  1 04:23:36 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:23:36 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:23:36 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:23:36 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:23:36 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:23:36 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:23:36 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:23:36 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:23:36 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:23:36 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:23:36 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:23:36 np0005540741 python3.9[153187]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:23:36 np0005540741 podman[153214]: 2025-12-01 09:23:36.571694899 +0000 UTC m=+0.049794978 container create 64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_dewdney, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec  1 04:23:36 np0005540741 systemd[1]: Started libpod-conmon-64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52.scope.
Dec  1 04:23:36 np0005540741 podman[153214]: 2025-12-01 09:23:36.543592898 +0000 UTC m=+0.021693007 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:23:36 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:23:36 np0005540741 podman[153214]: 2025-12-01 09:23:36.662046577 +0000 UTC m=+0.140146686 container init 64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_dewdney, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:23:36 np0005540741 podman[153214]: 2025-12-01 09:23:36.670523932 +0000 UTC m=+0.148624021 container start 64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_dewdney, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:23:36 np0005540741 podman[153214]: 2025-12-01 09:23:36.673967131 +0000 UTC m=+0.152067240 container attach 64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_dewdney, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec  1 04:23:36 np0005540741 hardcore_dewdney[153249]: 167 167
Dec  1 04:23:36 np0005540741 systemd[1]: libpod-64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52.scope: Deactivated successfully.
Dec  1 04:23:36 np0005540741 podman[153214]: 2025-12-01 09:23:36.676492514 +0000 UTC m=+0.154592623 container died 64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_dewdney, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507)
Dec  1 04:23:36 np0005540741 systemd[1]: var-lib-containers-storage-overlay-5aa925afd2dc71c7e1e1d3345230026e390cc158cc2b95cab3f8701cefb78b08-merged.mount: Deactivated successfully.
Dec  1 04:23:36 np0005540741 podman[153214]: 2025-12-01 09:23:36.717141457 +0000 UTC m=+0.195241576 container remove 64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_dewdney, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec  1 04:23:36 np0005540741 systemd[1]: libpod-conmon-64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52.scope: Deactivated successfully.
Dec  1 04:23:36 np0005540741 podman[153359]: 2025-12-01 09:23:36.952784068 +0000 UTC m=+0.068242191 container create bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_khayyam, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec  1 04:23:37 np0005540741 systemd[1]: Started libpod-conmon-bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8.scope.
Dec  1 04:23:37 np0005540741 podman[153359]: 2025-12-01 09:23:36.9293072 +0000 UTC m=+0.044765353 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:23:37 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:23:37 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56716004b9a1a38bc7c592e47420fdb241617eb95197c22e990356941b598fd1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:23:37 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56716004b9a1a38bc7c592e47420fdb241617eb95197c22e990356941b598fd1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:23:37 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56716004b9a1a38bc7c592e47420fdb241617eb95197c22e990356941b598fd1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:23:37 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56716004b9a1a38bc7c592e47420fdb241617eb95197c22e990356941b598fd1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:23:37 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56716004b9a1a38bc7c592e47420fdb241617eb95197c22e990356941b598fd1/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:23:37 np0005540741 podman[153359]: 2025-12-01 09:23:37.078390303 +0000 UTC m=+0.193848426 container init bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_khayyam, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec  1 04:23:37 np0005540741 podman[153359]: 2025-12-01 09:23:37.091193253 +0000 UTC m=+0.206651366 container start bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_khayyam, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Dec  1 04:23:37 np0005540741 python3.9[153380]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581016.0948708-138-134806601772212/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:23:37 np0005540741 podman[153359]: 2025-12-01 09:23:37.105516666 +0000 UTC m=+0.220974799 container attach bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_khayyam, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec  1 04:23:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v350: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:37 np0005540741 python3.9[153545]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:23:38 np0005540741 modest_khayyam[153391]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:23:38 np0005540741 modest_khayyam[153391]: --> relative data size: 1.0
Dec  1 04:23:38 np0005540741 modest_khayyam[153391]: --> All data devices are unavailable
Dec  1 04:23:38 np0005540741 systemd[1]: libpod-bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8.scope: Deactivated successfully.
Dec  1 04:23:38 np0005540741 systemd[1]: libpod-bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8.scope: Consumed 1.039s CPU time.
Dec  1 04:23:38 np0005540741 podman[153359]: 2025-12-01 09:23:38.211879587 +0000 UTC m=+1.327337740 container died bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_khayyam, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:23:38 np0005540741 systemd[1]: var-lib-containers-storage-overlay-56716004b9a1a38bc7c592e47420fdb241617eb95197c22e990356941b598fd1-merged.mount: Deactivated successfully.
Dec  1 04:23:38 np0005540741 python3.9[153686]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581017.2597456-138-256856895410513/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:23:38 np0005540741 podman[153359]: 2025-12-01 09:23:38.278117228 +0000 UTC m=+1.393575341 container remove bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_khayyam, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec  1 04:23:38 np0005540741 systemd[1]: libpod-conmon-bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8.scope: Deactivated successfully.
Dec  1 04:23:38 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:23:38 np0005540741 podman[153868]: 2025-12-01 09:23:38.879217547 +0000 UTC m=+0.061142376 container create 1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_hellman, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:23:38 np0005540741 podman[153868]: 2025-12-01 09:23:38.839929293 +0000 UTC m=+0.021854152 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:23:38 np0005540741 systemd[1]: Started libpod-conmon-1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3.scope.
Dec  1 04:23:38 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:23:39 np0005540741 podman[153868]: 2025-12-01 09:23:39.066558834 +0000 UTC m=+0.248483683 container init 1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_hellman, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:23:39 np0005540741 podman[153868]: 2025-12-01 09:23:39.073195155 +0000 UTC m=+0.255119984 container start 1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_hellman, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Dec  1 04:23:39 np0005540741 crazy_hellman[153884]: 167 167
Dec  1 04:23:39 np0005540741 systemd[1]: libpod-1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3.scope: Deactivated successfully.
Dec  1 04:23:39 np0005540741 podman[153868]: 2025-12-01 09:23:39.093069679 +0000 UTC m=+0.274994538 container attach 1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_hellman, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec  1 04:23:39 np0005540741 podman[153868]: 2025-12-01 09:23:39.093655816 +0000 UTC m=+0.275580645 container died 1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_hellman, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True)
Dec  1 04:23:39 np0005540741 systemd[1]: var-lib-containers-storage-overlay-71dedbc496ca1be46ebe106c3d7fc491d3c73a349847d341768f93fcd372dbd2-merged.mount: Deactivated successfully.
Dec  1 04:23:39 np0005540741 podman[153868]: 2025-12-01 09:23:39.244370815 +0000 UTC m=+0.426295644 container remove 1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_hellman, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:23:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v351: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:39 np0005540741 systemd[1]: libpod-conmon-1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3.scope: Deactivated successfully.
Dec  1 04:23:39 np0005540741 python3.9[154027]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:23:39 np0005540741 podman[154033]: 2025-12-01 09:23:39.453717437 +0000 UTC m=+0.095380544 container create 404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_poincare, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:23:39 np0005540741 podman[154033]: 2025-12-01 09:23:39.380905975 +0000 UTC m=+0.022569102 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:23:39 np0005540741 systemd[1]: Started libpod-conmon-404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae.scope.
Dec  1 04:23:39 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:23:39 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3fcba4984687e969443dd1741f4d0391e1c37e3f3546a4dcfd07596d40f5984/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:23:39 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3fcba4984687e969443dd1741f4d0391e1c37e3f3546a4dcfd07596d40f5984/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:23:39 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3fcba4984687e969443dd1741f4d0391e1c37e3f3546a4dcfd07596d40f5984/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:23:39 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3fcba4984687e969443dd1741f4d0391e1c37e3f3546a4dcfd07596d40f5984/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:23:39 np0005540741 podman[154033]: 2025-12-01 09:23:39.825944969 +0000 UTC m=+0.467608086 container init 404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_poincare, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:23:39 np0005540741 podman[154033]: 2025-12-01 09:23:39.833397125 +0000 UTC m=+0.475060232 container start 404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_poincare, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:23:39 np0005540741 podman[154033]: 2025-12-01 09:23:39.837669788 +0000 UTC m=+0.479332895 container attach 404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_poincare, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True)
Dec  1 04:23:39 np0005540741 python3.9[154173]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581019.0056908-182-248107640304222/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]: {
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:    "0": [
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:        {
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "devices": [
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "/dev/loop3"
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            ],
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "lv_name": "ceph_lv0",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "lv_size": "21470642176",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "name": "ceph_lv0",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "tags": {
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.cluster_name": "ceph",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.crush_device_class": "",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.encrypted": "0",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.osd_id": "0",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.type": "block",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.vdo": "0"
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            },
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "type": "block",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "vg_name": "ceph_vg0"
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:        }
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:    ],
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:    "1": [
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:        {
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "devices": [
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "/dev/loop4"
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            ],
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "lv_name": "ceph_lv1",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "lv_size": "21470642176",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "name": "ceph_lv1",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "tags": {
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.cluster_name": "ceph",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.crush_device_class": "",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.encrypted": "0",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.osd_id": "1",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.type": "block",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.vdo": "0"
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            },
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "type": "block",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "vg_name": "ceph_vg1"
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:        }
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:    ],
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:    "2": [
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:        {
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "devices": [
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "/dev/loop5"
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            ],
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "lv_name": "ceph_lv2",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "lv_size": "21470642176",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "name": "ceph_lv2",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "tags": {
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.cluster_name": "ceph",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.crush_device_class": "",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.encrypted": "0",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.osd_id": "2",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.type": "block",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:                "ceph.vdo": "0"
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            },
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "type": "block",
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:            "vg_name": "ceph_vg2"
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:        }
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]:    ]
Dec  1 04:23:40 np0005540741 vigorous_poincare[154120]: }
Dec  1 04:23:40 np0005540741 systemd[1]: libpod-404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae.scope: Deactivated successfully.
Dec  1 04:23:40 np0005540741 podman[154033]: 2025-12-01 09:23:40.603693549 +0000 UTC m=+1.245356686 container died 404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_poincare, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec  1 04:23:41 np0005540741 systemd[1]: var-lib-containers-storage-overlay-a3fcba4984687e969443dd1741f4d0391e1c37e3f3546a4dcfd07596d40f5984-merged.mount: Deactivated successfully.
Dec  1 04:23:41 np0005540741 podman[154033]: 2025-12-01 09:23:41.07344179 +0000 UTC m=+1.715104887 container remove 404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_poincare, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:23:41 np0005540741 python3.9[154325]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:23:41 np0005540741 systemd[1]: libpod-conmon-404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae.scope: Deactivated successfully.
Dec  1 04:23:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v352: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:41 np0005540741 python3.9[154564]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581020.1299899-182-187927865733140/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:23:41 np0005540741 podman[154605]: 2025-12-01 09:23:41.629402425 +0000 UTC m=+0.044355734 container create 9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_heyrovsky, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef)
Dec  1 04:23:41 np0005540741 systemd[1]: Started libpod-conmon-9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90.scope.
Dec  1 04:23:41 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:23:41 np0005540741 podman[154605]: 2025-12-01 09:23:41.605890151 +0000 UTC m=+0.020843480 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:23:41 np0005540741 podman[154605]: 2025-12-01 09:23:41.71595213 +0000 UTC m=+0.130905449 container init 9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_heyrovsky, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:23:41 np0005540741 podman[154605]: 2025-12-01 09:23:41.725097709 +0000 UTC m=+0.140051018 container start 9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_heyrovsky, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:23:41 np0005540741 podman[154605]: 2025-12-01 09:23:41.729245156 +0000 UTC m=+0.144198555 container attach 9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_heyrovsky, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507)
Dec  1 04:23:41 np0005540741 admiring_heyrovsky[154644]: 167 167
Dec  1 04:23:41 np0005540741 systemd[1]: libpod-9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90.scope: Deactivated successfully.
Dec  1 04:23:41 np0005540741 podman[154605]: 2025-12-01 09:23:41.736672516 +0000 UTC m=+0.151625845 container died 9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_heyrovsky, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Dec  1 04:23:41 np0005540741 systemd[1]: var-lib-containers-storage-overlay-c08938f199fd711aad54d76a27632c7339101eeb290083492db8abadd11597a5-merged.mount: Deactivated successfully.
Dec  1 04:23:41 np0005540741 podman[154605]: 2025-12-01 09:23:41.773940498 +0000 UTC m=+0.188893807 container remove 9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_heyrovsky, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:23:41 np0005540741 systemd[1]: libpod-conmon-9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90.scope: Deactivated successfully.
Dec  1 04:23:41 np0005540741 podman[154721]: 2025-12-01 09:23:41.944675792 +0000 UTC m=+0.050001404 container create 887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_liskov, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:23:42 np0005540741 podman[154721]: 2025-12-01 09:23:41.922204767 +0000 UTC m=+0.027530399 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:23:42 np0005540741 systemd[1]: Started libpod-conmon-887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5.scope.
Dec  1 04:23:42 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:23:42 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e272aaada3acd3a7ee19a0c40cdf63e9b3d03732d7313aeb9d26a4a90ac7a0ce/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:23:42 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e272aaada3acd3a7ee19a0c40cdf63e9b3d03732d7313aeb9d26a4a90ac7a0ce/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:23:42 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e272aaada3acd3a7ee19a0c40cdf63e9b3d03732d7313aeb9d26a4a90ac7a0ce/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:23:42 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e272aaada3acd3a7ee19a0c40cdf63e9b3d03732d7313aeb9d26a4a90ac7a0ce/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:23:42 np0005540741 podman[154721]: 2025-12-01 09:23:42.216310185 +0000 UTC m=+0.321635827 container init 887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_liskov, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef)
Dec  1 04:23:42 np0005540741 podman[154721]: 2025-12-01 09:23:42.228140509 +0000 UTC m=+0.333466121 container start 887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_liskov, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3)
Dec  1 04:23:42 np0005540741 podman[154721]: 2025-12-01 09:23:42.232217314 +0000 UTC m=+0.337542926 container attach 887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_liskov, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec  1 04:23:42 np0005540741 python3.9[154807]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:23:42 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:42Z|00025|memory|INFO|17664 kB peak resident set size after 30.1 seconds
Dec  1 04:23:42 np0005540741 ovn_controller[149914]: 2025-12-01T09:23:42Z|00026|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Dec  1 04:23:42 np0005540741 podman[154940]: 2025-12-01 09:23:42.915008302 +0000 UTC m=+0.094915192 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 04:23:43 np0005540741 python3.9[154989]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:23:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:23:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:23:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:23:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:23:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:23:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]: {
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:        "osd_id": 0,
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:        "type": "bluestore"
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:    },
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:        "osd_id": 1,
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:        "type": "bluestore"
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:    },
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:        "osd_id": 2,
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:        "type": "bluestore"
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]:    }
Dec  1 04:23:43 np0005540741 gracious_liskov[154810]: }
Dec  1 04:23:43 np0005540741 systemd[1]: libpod-887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5.scope: Deactivated successfully.
Dec  1 04:23:43 np0005540741 systemd[1]: libpod-887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5.scope: Consumed 1.034s CPU time.
Dec  1 04:23:43 np0005540741 podman[154721]: 2025-12-01 09:23:43.258853705 +0000 UTC m=+1.364179337 container died 887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_liskov, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, ceph=True)
Dec  1 04:23:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v353: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:43 np0005540741 systemd[1]: var-lib-containers-storage-overlay-e272aaada3acd3a7ee19a0c40cdf63e9b3d03732d7313aeb9d26a4a90ac7a0ce-merged.mount: Deactivated successfully.
Dec  1 04:23:43 np0005540741 podman[154721]: 2025-12-01 09:23:43.316041821 +0000 UTC m=+1.421367423 container remove 887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_liskov, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:23:43 np0005540741 systemd[1]: libpod-conmon-887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5.scope: Deactivated successfully.
Dec  1 04:23:43 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:23:43 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:23:43 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:23:43 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:23:43 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 0c76680d-f4e5-4ca4-8d79-eafe7aee3f0a does not exist
Dec  1 04:23:43 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:23:43 np0005540741 python3.9[155237]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:23:44 np0005540741 python3.9[155315]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:23:44 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:23:44 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:23:44 np0005540741 python3.9[155467]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:23:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v354: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:45 np0005540741 python3.9[155545]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:23:45 np0005540741 python3.9[155697]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:23:46 np0005540741 python3.9[155849]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:23:47 np0005540741 python3.9[155927]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:23:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v355: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:47 np0005540741 python3.9[156079]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:23:48 np0005540741 python3.9[156157]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:23:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:23:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v356: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:49 np0005540741 python3.9[156309]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:23:49 np0005540741 systemd[1]: Reloading.
Dec  1 04:23:49 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:23:49 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:23:50 np0005540741 python3.9[156498]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:23:50 np0005540741 python3.9[156576]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:23:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v357: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:51 np0005540741 python3.9[156728]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:23:51 np0005540741 python3.9[156806]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:23:52 np0005540741 python3.9[156958]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:23:52 np0005540741 systemd[1]: Reloading.
Dec  1 04:23:53 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:23:53 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:23:53 np0005540741 systemd[1]: Starting Create netns directory...
Dec  1 04:23:53 np0005540741 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  1 04:23:53 np0005540741 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  1 04:23:53 np0005540741 systemd[1]: Finished Create netns directory.
Dec  1 04:23:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v358: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:23:54 np0005540741 python3.9[157151]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:23:54 np0005540741 python3.9[157303]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:23:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v359: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:55 np0005540741 python3.9[157426]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581034.3100777-333-154824419591383/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:23:56 np0005540741 python3.9[157578]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:23:57 np0005540741 python3.9[157730]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:23:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v360: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:23:57 np0005540741 python3.9[157853]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581036.627149-358-267172713260222/.source.json _original_basename=._8ma83r7 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:23:58 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:23:58 np0005540741 python3.9[158005]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:23:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v361: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:00 np0005540741 python3.9[158432]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec  1 04:24:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v362: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:01 np0005540741 python3.9[158584]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  1 04:24:02 np0005540741 python3.9[158736]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  1 04:24:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v363: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:24:04 np0005540741 python3[158914]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  1 04:24:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v364: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v365: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:08 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:24:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v366: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:10 np0005540741 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:24:10 np0005540741 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 4208 writes, 19K keys, 4208 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 4208 writes, 369 syncs, 11.40 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4208 writes, 19K keys, 4208 commit groups, 1.0 writes per commit group, ingest: 15.93 MB, 0.03 MB/s#012Interval WAL: 4208 writes, 369 syncs, 11.40 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Dec  1 04:24:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v367: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:24:12
Dec  1 04:24:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:24:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:24:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'cephfs.cephfs.meta', 'cephfs.cephfs.data', 'vms', '.mgr', 'images', 'volumes']
Dec  1 04:24:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:24:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:24:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:24:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:24:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:24:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:24:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:24:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:24:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:24:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:24:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:24:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:24:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:24:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:24:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:24:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:24:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:24:13 np0005540741 podman[158929]: 2025-12-01 09:24:13.181098474 +0000 UTC m=+8.647860806 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  1 04:24:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v368: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:13 np0005540741 podman[159055]: 2025-12-01 09:24:13.360392029 +0000 UTC m=+0.055227171 container create 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Dec  1 04:24:13 np0005540741 podman[159055]: 2025-12-01 09:24:13.331017869 +0000 UTC m=+0.025853021 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  1 04:24:13 np0005540741 python3[158914]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  1 04:24:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:24:13 np0005540741 podman[159212]: 2025-12-01 09:24:13.996207401 +0000 UTC m=+0.094624204 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  1 04:24:14 np0005540741 python3.9[159264]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:24:14 np0005540741 python3.9[159424]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:24:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v369: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:15 np0005540741 python3.9[159500]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:24:15 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:24:15 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.2 total, 600.0 interval#012Cumulative writes: 4343 writes, 19K keys, 4343 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 4343 writes, 398 syncs, 10.91 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4343 writes, 19K keys, 4343 commit groups, 1.0 writes per commit group, ingest: 16.04 MB, 0.03 MB/s#012Interval WAL: 4343 writes, 398 syncs, 10.91 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Dec  1 04:24:16 np0005540741 python3.9[159651]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764581055.5203018-446-273642022239231/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:24:16 np0005540741 python3.9[159727]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 04:24:16 np0005540741 systemd[1]: Reloading.
Dec  1 04:24:16 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:24:16 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:24:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v370: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:17 np0005540741 python3.9[159837]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:24:17 np0005540741 systemd[1]: Reloading.
Dec  1 04:24:17 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:24:17 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:24:18 np0005540741 systemd[1]: Starting ovn_metadata_agent container...
Dec  1 04:24:18 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:24:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c2505faea1fa7dfc6f89e3da507131bc1f3625ef975dfc2e3193dde241ea379/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec  1 04:24:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c2505faea1fa7dfc6f89e3da507131bc1f3625ef975dfc2e3193dde241ea379/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  1 04:24:18 np0005540741 systemd[1]: Started /usr/bin/podman healthcheck run 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba.
Dec  1 04:24:18 np0005540741 podman[159878]: 2025-12-01 09:24:18.43488009 +0000 UTC m=+0.377682081 container init 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: + sudo -E kolla_set_configs
Dec  1 04:24:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:24:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:24:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:24:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:24:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:24:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:24:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:24:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:24:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:24:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:24:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:24:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:24:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:24:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:24:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:24:18 np0005540741 podman[159878]: 2025-12-01 09:24:18.465750552 +0000 UTC m=+0.408552443 container start 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  1 04:24:18 np0005540741 edpm-start-podman-container[159878]: ovn_metadata_agent
Dec  1 04:24:18 np0005540741 edpm-start-podman-container[159877]: Creating additional drop-in dependency for "ovn_metadata_agent" (195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba)
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: INFO:__main__:Validating config file
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: INFO:__main__:Copying service configuration files
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: INFO:__main__:Writing out command to execute
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: INFO:__main__:Setting permission for /var/lib/neutron
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: ++ cat /run_command
Dec  1 04:24:18 np0005540741 systemd[1]: Reloading.
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: + CMD=neutron-ovn-metadata-agent
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: + ARGS=
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: + sudo kolla_copy_cacerts
Dec  1 04:24:18 np0005540741 podman[159900]: 2025-12-01 09:24:18.560309973 +0000 UTC m=+0.082101700 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent)
Dec  1 04:24:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: + [[ ! -n '' ]]
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: + . kolla_extend_start
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: Running command: 'neutron-ovn-metadata-agent'
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: + umask 0022
Dec  1 04:24:18 np0005540741 ovn_metadata_agent[159893]: + exec neutron-ovn-metadata-agent
Dec  1 04:24:18 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:24:18 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:24:18 np0005540741 systemd[1]: Started ovn_metadata_agent container.
Dec  1 04:24:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v371: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:19 np0005540741 systemd[1]: session-48.scope: Deactivated successfully.
Dec  1 04:24:19 np0005540741 systemd[1]: session-48.scope: Consumed 57.378s CPU time.
Dec  1 04:24:19 np0005540741 systemd-logind[788]: Session 48 logged out. Waiting for processes to exit.
Dec  1 04:24:19 np0005540741 systemd-logind[788]: Removed session 48.
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.418 159899 INFO neutron.common.config [-] Logging enabled!#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.418 159899 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.418 159899 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.419 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.419 159899 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.419 159899 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.419 159899 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.419 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.419 159899 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.419 159899 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.419 159899 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.420 159899 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.420 159899 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.420 159899 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.420 159899 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.420 159899 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.420 159899 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.420 159899 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.420 159899 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.426 159899 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.426 159899 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.426 159899 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.426 159899 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.426 159899 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.426 159899 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.426 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.426 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.431 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.431 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.431 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.431 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.431 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.431 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.431 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.431 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.432 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.432 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.432 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.432 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.432 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.432 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.432 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.432 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.434 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.434 159899 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.434 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.434 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.434 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.434 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.434 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.434 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.438 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.438 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.438 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.438 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.438 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.438 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.438 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.438 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.441 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.441 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.441 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.441 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.441 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.441 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.441 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.441 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.445 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.445 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.445 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.445 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.445 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.445 159899 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.445 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.445 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.447 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.447 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.447 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.447 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.447 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.447 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.447 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.447 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.448 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.448 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.448 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.448 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.448 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.448 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.448 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.448 159899 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.451 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.451 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.451 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.451 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.451 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.451 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.451 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.451 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.453 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.453 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.453 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.453 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.453 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.462 159899 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.462 159899 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.462 159899 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.462 159899 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.462 159899 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.475 159899 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name a8013a17-6378-4c2f-a5de-9d3b29c7a42e (UUID: a8013a17-6378-4c2f-a5de-9d3b29c7a42e) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.505 159899 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.505 159899 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.505 159899 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.505 159899 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.510 159899 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.516 159899 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.523 159899 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'a8013a17-6378-4c2f-a5de-9d3b29c7a42e'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7ff6ad027af0>], external_ids={}, name=a8013a17-6378-4c2f-a5de-9d3b29c7a42e, nb_cfg_timestamp=1764581000829, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.524 159899 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7ff6ad02ab20>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.525 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.525 159899 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.526 159899 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.526 159899 INFO oslo_service.service [-] Starting 1 workers#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.533 159899 DEBUG oslo_service.service [-] Started child 160008 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.537 159899 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpdz57tbq9/privsep.sock']#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.537 160008 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-166554'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.562 160008 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.562 160008 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.562 160008 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.569 160008 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.576 160008 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec  1 04:24:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.582 160008 INFO eventlet.wsgi.server [-] (160008) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Dec  1 04:24:21 np0005540741 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec  1 04:24:21 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.259 159899 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec  1 04:24:21 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.260 159899 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpdz57tbq9/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec  1 04:24:21 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.144 160013 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  1 04:24:21 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.149 160013 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  1 04:24:21 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.152 160013 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Dec  1 04:24:21 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.152 160013 INFO oslo.privsep.daemon [-] privsep daemon running as pid 160013#033[00m
Dec  1 04:24:21 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.263 160013 DEBUG oslo.privsep.daemon [-] privsep: reply[19cd340e-88d2-4d9a-886a-294f0436eae8]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 04:24:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v372: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:21 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.792 160013 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:24:21 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.792 160013 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:24:21 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.793 160013 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.410 160013 DEBUG oslo.privsep.daemon [-] privsep: reply[f673a4e7-3009-4b27-8769-e67efc72cbca]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.413 159899 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=a8013a17-6378-4c2f-a5de-9d3b29c7a42e, column=external_ids, values=({'neutron:ovn-metadata-id': '7ca6876a-62db-5b7a-a446-404679c57fc8'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.428 159899 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a8013a17-6378-4c2f-a5de-9d3b29c7a42e, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.435 159899 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.436 159899 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.436 159899 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.436 159899 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.436 159899 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.437 159899 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.437 159899 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.437 159899 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.438 159899 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.438 159899 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.438 159899 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.439 159899 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.439 159899 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.439 159899 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.440 159899 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.440 159899 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.440 159899 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.441 159899 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.441 159899 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.441 159899 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.442 159899 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.442 159899 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.442 159899 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.443 159899 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.443 159899 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.443 159899 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.444 159899 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.444 159899 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.444 159899 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.445 159899 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.445 159899 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.445 159899 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.446 159899 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.446 159899 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.446 159899 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.447 159899 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.447 159899 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.447 159899 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.448 159899 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.448 159899 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.448 159899 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.449 159899 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.449 159899 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.449 159899 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.450 159899 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.450 159899 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.450 159899 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.450 159899 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.450 159899 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.450 159899 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.451 159899 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.451 159899 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.451 159899 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.451 159899 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.451 159899 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.452 159899 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.452 159899 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.452 159899 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.452 159899 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.452 159899 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.453 159899 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.453 159899 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.453 159899 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.453 159899 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.453 159899 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.454 159899 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.454 159899 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.454 159899 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.454 159899 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.455 159899 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.455 159899 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.455 159899 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.455 159899 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.455 159899 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.456 159899 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.456 159899 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.456 159899 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.456 159899 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.456 159899 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.457 159899 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.457 159899 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.457 159899 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.457 159899 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.457 159899 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.458 159899 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.458 159899 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.458 159899 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.458 159899 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.458 159899 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.458 159899 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.459 159899 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.459 159899 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.459 159899 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.459 159899 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.459 159899 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.459 159899 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.460 159899 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.460 159899 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.460 159899 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.460 159899 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.460 159899 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.460 159899 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.460 159899 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.461 159899 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.461 159899 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.461 159899 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.461 159899 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.461 159899 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.462 159899 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.462 159899 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.462 159899 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.462 159899 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.462 159899 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.463 159899 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.463 159899 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.463 159899 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.463 159899 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.464 159899 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.464 159899 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.464 159899 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.464 159899 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.464 159899 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.464 159899 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.465 159899 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.465 159899 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.465 159899 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.465 159899 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.465 159899 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.465 159899 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.466 159899 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.466 159899 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.466 159899 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.466 159899 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.466 159899 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.467 159899 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.467 159899 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.467 159899 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.467 159899 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.467 159899 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.467 159899 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.468 159899 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.468 159899 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.468 159899 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.468 159899 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.468 159899 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.468 159899 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.468 159899 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.469 159899 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.469 159899 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.469 159899 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.469 159899 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.469 159899 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.469 159899 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.470 159899 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.470 159899 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.470 159899 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.470 159899 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.470 159899 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.470 159899 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.470 159899 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.471 159899 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.471 159899 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.471 159899 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.471 159899 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.471 159899 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.471 159899 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.472 159899 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.472 159899 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.472 159899 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.472 159899 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.472 159899 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.472 159899 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.473 159899 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.473 159899 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.473 159899 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.473 159899 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.473 159899 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.473 159899 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.473 159899 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.474 159899 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.474 159899 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.474 159899 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.474 159899 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.474 159899 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.475 159899 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.475 159899 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.475 159899 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.475 159899 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.475 159899 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.476 159899 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.476 159899 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.476 159899 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.476 159899 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.476 159899 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.476 159899 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.477 159899 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.477 159899 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.477 159899 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.477 159899 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.477 159899 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.477 159899 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.478 159899 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.478 159899 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.478 159899 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.478 159899 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.478 159899 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.478 159899 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.478 159899 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.479 159899 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.479 159899 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.479 159899 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.479 159899 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.479 159899 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.479 159899 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.479 159899 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.480 159899 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.480 159899 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.480 159899 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.480 159899 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.480 159899 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.480 159899 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.480 159899 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.481 159899 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.481 159899 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.481 159899 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.481 159899 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.481 159899 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.481 159899 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.481 159899 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.482 159899 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.482 159899 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.482 159899 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.482 159899 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.482 159899 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.483 159899 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.483 159899 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.483 159899 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.483 159899 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.483 159899 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.484 159899 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.484 159899 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.484 159899 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.484 159899 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.484 159899 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.485 159899 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.485 159899 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.485 159899 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.485 159899 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.485 159899 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.485 159899 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.485 159899 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.485 159899 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.486 159899 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.486 159899 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.486 159899 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.486 159899 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.486 159899 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.486 159899 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.486 159899 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.487 159899 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.487 159899 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.487 159899 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.487 159899 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.487 159899 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.487 159899 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.487 159899 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.487 159899 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.488 159899 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.488 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.488 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.488 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.488 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.488 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.488 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.489 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.489 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.489 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.489 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.489 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.489 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.489 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.489 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.490 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.490 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.490 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.490 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.490 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.490 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.490 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.491 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.491 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.491 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.491 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.491 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.491 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.491 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.491 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.492 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.492 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.492 159899 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.492 159899 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.492 159899 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.492 159899 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:24:22 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.492 159899 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  1 04:24:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v373: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:24:24 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:24:24 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.4 total, 600.0 interval#012Cumulative writes: 4162 writes, 19K keys, 4162 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 4162 writes, 352 syncs, 11.82 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4162 writes, 19K keys, 4162 commit groups, 1.0 writes per commit group, ingest: 15.87 MB, 0.03 MB/s#012Interval WAL: 4162 writes, 352 syncs, 11.82 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.4 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.4 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.4 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Dec  1 04:24:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v374: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:25 np0005540741 systemd-logind[788]: New session 49 of user zuul.
Dec  1 04:24:25 np0005540741 systemd[1]: Started Session 49 of User zuul.
Dec  1 04:24:26 np0005540741 python3.9[160172]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:24:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v375: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:27 np0005540741 python3.9[160330]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:24:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:24:28 np0005540741 ceph-mgr[75324]: [devicehealth INFO root] Check health
Dec  1 04:24:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v376: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:29 np0005540741 python3.9[160493]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 04:24:29 np0005540741 systemd[1]: Reloading.
Dec  1 04:24:29 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:24:29 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:24:30 np0005540741 python3.9[160679]: ansible-ansible.builtin.service_facts Invoked
Dec  1 04:24:30 np0005540741 network[160696]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:24:30 np0005540741 network[160697]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:24:30 np0005540741 network[160698]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:24:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v377: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v378: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:24:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v379: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:35 np0005540741 python3.9[160960]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:24:36 np0005540741 python3.9[161113]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:24:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v380: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:37 np0005540741 python3.9[161266]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:24:38 np0005540741 python3.9[161419]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:24:38 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:24:38 np0005540741 python3.9[161572]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:24:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v381: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:39 np0005540741 python3.9[161725]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:24:40 np0005540741 python3.9[161878]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:24:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v382: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:41 np0005540741 python3.9[162031]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:24:42 np0005540741 python3.9[162183]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:24:42 np0005540741 python3.9[162335]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:24:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:24:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:24:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:24:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:24:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:24:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:24:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v383: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:43 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:24:43 np0005540741 python3.9[162487]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:24:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} v 0) v1
Dec  1 04:24:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  1 04:24:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:24:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:24:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:24:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:24:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:24:44 np0005540741 podman[162756]: 2025-12-01 09:24:44.253744257 +0000 UTC m=+0.162898560 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Dec  1 04:24:44 np0005540741 python3.9[162759]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:24:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:24:44 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 9bf0fb69-ad03-41da-9bb2-a1dc5b800832 does not exist
Dec  1 04:24:44 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev a4817f29-25fe-4abc-8f2f-768563498f2e does not exist
Dec  1 04:24:44 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev c89a7ad3-c712-4738-832f-80d64b4a3f34 does not exist
Dec  1 04:24:45 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:24:45 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:24:45 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:24:45 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:24:45 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:24:45 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:24:45 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  1 04:24:45 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:24:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v384: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:45 np0005540741 podman[163096]: 2025-12-01 09:24:45.758050067 +0000 UTC m=+0.019870624 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:24:45 np0005540741 podman[163096]: 2025-12-01 09:24:45.887868006 +0000 UTC m=+0.149688543 container create 033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_williams, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:24:45 np0005540741 systemd[1]: Started libpod-conmon-033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4.scope.
Dec  1 04:24:45 np0005540741 python3.9[162954]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:24:45 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:24:45 np0005540741 podman[163096]: 2025-12-01 09:24:45.970325701 +0000 UTC m=+0.232146238 container init 033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_williams, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec  1 04:24:45 np0005540741 podman[163096]: 2025-12-01 09:24:45.97871852 +0000 UTC m=+0.240539077 container start 033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_williams, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec  1 04:24:45 np0005540741 podman[163096]: 2025-12-01 09:24:45.982588376 +0000 UTC m=+0.244408943 container attach 033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_williams, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507)
Dec  1 04:24:45 np0005540741 busy_williams[163112]: 167 167
Dec  1 04:24:45 np0005540741 systemd[1]: libpod-033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4.scope: Deactivated successfully.
Dec  1 04:24:45 np0005540741 podman[163096]: 2025-12-01 09:24:45.98419299 +0000 UTC m=+0.246013527 container died 033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_williams, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3)
Dec  1 04:24:46 np0005540741 systemd[1]: var-lib-containers-storage-overlay-3167aad26c8120ddb5130faefb048531f1d49e74dec54f29c524c62e11012c6c-merged.mount: Deactivated successfully.
Dec  1 04:24:46 np0005540741 podman[163096]: 2025-12-01 09:24:46.022111157 +0000 UTC m=+0.283931694 container remove 033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_williams, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec  1 04:24:46 np0005540741 systemd[1]: libpod-conmon-033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4.scope: Deactivated successfully.
Dec  1 04:24:46 np0005540741 podman[163195]: 2025-12-01 09:24:46.177490875 +0000 UTC m=+0.043602763 container create 28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_heisenberg, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec  1 04:24:46 np0005540741 systemd[1]: Started libpod-conmon-28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb.scope.
Dec  1 04:24:46 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:24:46 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:24:46 np0005540741 podman[163195]: 2025-12-01 09:24:46.161721684 +0000 UTC m=+0.027833592 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:24:46 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:24:46 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07f1e020e1a732e2e67bb828616447b0b43cc8581e448dae93d82fd867d7224a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:24:46 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07f1e020e1a732e2e67bb828616447b0b43cc8581e448dae93d82fd867d7224a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:24:46 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07f1e020e1a732e2e67bb828616447b0b43cc8581e448dae93d82fd867d7224a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:24:46 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07f1e020e1a732e2e67bb828616447b0b43cc8581e448dae93d82fd867d7224a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:24:46 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07f1e020e1a732e2e67bb828616447b0b43cc8581e448dae93d82fd867d7224a/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:24:46 np0005540741 podman[163195]: 2025-12-01 09:24:46.277703855 +0000 UTC m=+0.143815783 container init 28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_heisenberg, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Dec  1 04:24:46 np0005540741 podman[163195]: 2025-12-01 09:24:46.287429811 +0000 UTC m=+0.153541709 container start 28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_heisenberg, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec  1 04:24:46 np0005540741 podman[163195]: 2025-12-01 09:24:46.291023799 +0000 UTC m=+0.157135707 container attach 28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_heisenberg, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec  1 04:24:46 np0005540741 python3.9[163307]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:24:47 np0005540741 python3.9[163467]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:24:47 np0005540741 boring_heisenberg[163250]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:24:47 np0005540741 boring_heisenberg[163250]: --> relative data size: 1.0
Dec  1 04:24:47 np0005540741 boring_heisenberg[163250]: --> All data devices are unavailable
Dec  1 04:24:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v385: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:47 np0005540741 systemd[1]: libpod-28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb.scope: Deactivated successfully.
Dec  1 04:24:47 np0005540741 systemd[1]: libpod-28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb.scope: Consumed 1.010s CPU time.
Dec  1 04:24:47 np0005540741 podman[163195]: 2025-12-01 09:24:47.350832204 +0000 UTC m=+1.216944102 container died 28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_heisenberg, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:24:47 np0005540741 systemd[1]: var-lib-containers-storage-overlay-07f1e020e1a732e2e67bb828616447b0b43cc8581e448dae93d82fd867d7224a-merged.mount: Deactivated successfully.
Dec  1 04:24:47 np0005540741 podman[163195]: 2025-12-01 09:24:47.430337128 +0000 UTC m=+1.296449046 container remove 28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_heisenberg, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec  1 04:24:47 np0005540741 systemd[1]: libpod-conmon-28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb.scope: Deactivated successfully.
Dec  1 04:24:47 np0005540741 python3.9[163747]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:24:48 np0005540741 podman[163812]: 2025-12-01 09:24:48.014925471 +0000 UTC m=+0.049192066 container create 3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_cartwright, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec  1 04:24:48 np0005540741 systemd[1]: Started libpod-conmon-3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf.scope.
Dec  1 04:24:48 np0005540741 podman[163812]: 2025-12-01 09:24:47.990948206 +0000 UTC m=+0.025214831 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:24:48 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:24:48 np0005540741 podman[163812]: 2025-12-01 09:24:48.103258936 +0000 UTC m=+0.137525531 container init 3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_cartwright, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:24:48 np0005540741 podman[163812]: 2025-12-01 09:24:48.109697552 +0000 UTC m=+0.143964147 container start 3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_cartwright, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Dec  1 04:24:48 np0005540741 podman[163812]: 2025-12-01 09:24:48.11288821 +0000 UTC m=+0.147154805 container attach 3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_cartwright, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:24:48 np0005540741 hungry_cartwright[163851]: 167 167
Dec  1 04:24:48 np0005540741 systemd[1]: libpod-3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf.scope: Deactivated successfully.
Dec  1 04:24:48 np0005540741 podman[163812]: 2025-12-01 09:24:48.11362428 +0000 UTC m=+0.147890875 container died 3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_cartwright, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec  1 04:24:48 np0005540741 systemd[1]: var-lib-containers-storage-overlay-698419201c78f695b168d0b0f88f7fbb6acfe1260a328e40f46878484a265c4c-merged.mount: Deactivated successfully.
Dec  1 04:24:48 np0005540741 podman[163812]: 2025-12-01 09:24:48.149135321 +0000 UTC m=+0.183401916 container remove 3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_cartwright, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:24:48 np0005540741 systemd[1]: libpod-conmon-3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf.scope: Deactivated successfully.
Dec  1 04:24:48 np0005540741 podman[163957]: 2025-12-01 09:24:48.298954897 +0000 UTC m=+0.039154472 container create 796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_taussig, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec  1 04:24:48 np0005540741 systemd[1]: Started libpod-conmon-796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d.scope.
Dec  1 04:24:48 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:24:48 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33abc402fbff47d6f65d10ddc4a5ea1eecbbb6fe8e9a4009a6a3e278e41c1946/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:24:48 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33abc402fbff47d6f65d10ddc4a5ea1eecbbb6fe8e9a4009a6a3e278e41c1946/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:24:48 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33abc402fbff47d6f65d10ddc4a5ea1eecbbb6fe8e9a4009a6a3e278e41c1946/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:24:48 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33abc402fbff47d6f65d10ddc4a5ea1eecbbb6fe8e9a4009a6a3e278e41c1946/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:24:48 np0005540741 podman[163957]: 2025-12-01 09:24:48.376132487 +0000 UTC m=+0.116332142 container init 796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_taussig, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Dec  1 04:24:48 np0005540741 podman[163957]: 2025-12-01 09:24:48.281829799 +0000 UTC m=+0.022029404 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:24:48 np0005540741 podman[163957]: 2025-12-01 09:24:48.384793504 +0000 UTC m=+0.124993079 container start 796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_taussig, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec  1 04:24:48 np0005540741 podman[163957]: 2025-12-01 09:24:48.43256384 +0000 UTC m=+0.172763445 container attach 796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_taussig, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:24:48 np0005540741 python3.9[163992]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:24:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:24:49 np0005540741 podman[164123]: 2025-12-01 09:24:49.030134538 +0000 UTC m=+0.119408296 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]: {
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:    "0": [
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:        {
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "devices": [
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "/dev/loop3"
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            ],
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "lv_name": "ceph_lv0",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "lv_size": "21470642176",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "name": "ceph_lv0",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "tags": {
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.cluster_name": "ceph",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.crush_device_class": "",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.encrypted": "0",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.osd_id": "0",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.type": "block",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.vdo": "0"
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            },
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "type": "block",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "vg_name": "ceph_vg0"
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:        }
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:    ],
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:    "1": [
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:        {
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "devices": [
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "/dev/loop4"
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            ],
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "lv_name": "ceph_lv1",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "lv_size": "21470642176",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "name": "ceph_lv1",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "tags": {
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.cluster_name": "ceph",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.crush_device_class": "",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.encrypted": "0",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.osd_id": "1",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.type": "block",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.vdo": "0"
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            },
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "type": "block",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "vg_name": "ceph_vg1"
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:        }
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:    ],
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:    "2": [
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:        {
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "devices": [
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "/dev/loop5"
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            ],
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "lv_name": "ceph_lv2",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "lv_size": "21470642176",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "name": "ceph_lv2",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "tags": {
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.cluster_name": "ceph",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.crush_device_class": "",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.encrypted": "0",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.osd_id": "2",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.type": "block",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:                "ceph.vdo": "0"
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            },
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "type": "block",
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:            "vg_name": "ceph_vg2"
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:        }
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]:    ]
Dec  1 04:24:49 np0005540741 youthful_taussig[163995]: }
Dec  1 04:24:49 np0005540741 systemd[1]: libpod-796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d.scope: Deactivated successfully.
Dec  1 04:24:49 np0005540741 podman[163957]: 2025-12-01 09:24:49.178878605 +0000 UTC m=+0.919078190 container died 796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_taussig, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:24:49 np0005540741 python3.9[164169]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:24:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v386: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:49 np0005540741 systemd[1]: var-lib-containers-storage-overlay-33abc402fbff47d6f65d10ddc4a5ea1eecbbb6fe8e9a4009a6a3e278e41c1946-merged.mount: Deactivated successfully.
Dec  1 04:24:49 np0005540741 podman[163957]: 2025-12-01 09:24:49.550847075 +0000 UTC m=+1.291046650 container remove 796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_taussig, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:24:49 np0005540741 systemd[1]: libpod-conmon-796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d.scope: Deactivated successfully.
Dec  1 04:24:49 np0005540741 python3.9[164338]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:24:50 np0005540741 podman[164614]: 2025-12-01 09:24:50.232677137 +0000 UTC m=+0.051907371 container create 41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_nobel, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec  1 04:24:50 np0005540741 systemd[1]: Started libpod-conmon-41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de.scope.
Dec  1 04:24:50 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:24:50 np0005540741 podman[164614]: 2025-12-01 09:24:50.203273863 +0000 UTC m=+0.022504077 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:24:50 np0005540741 podman[164614]: 2025-12-01 09:24:50.316283393 +0000 UTC m=+0.135513627 container init 41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_nobel, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:24:50 np0005540741 podman[164614]: 2025-12-01 09:24:50.3267807 +0000 UTC m=+0.146010924 container start 41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_nobel, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Dec  1 04:24:50 np0005540741 podman[164614]: 2025-12-01 09:24:50.330886432 +0000 UTC m=+0.150116666 container attach 41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_nobel, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:24:50 np0005540741 affectionate_nobel[164647]: 167 167
Dec  1 04:24:50 np0005540741 systemd[1]: libpod-41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de.scope: Deactivated successfully.
Dec  1 04:24:50 np0005540741 podman[164614]: 2025-12-01 09:24:50.334782088 +0000 UTC m=+0.154012332 container died 41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_nobel, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec  1 04:24:50 np0005540741 systemd[1]: var-lib-containers-storage-overlay-f068f6e84383dd90d8e92dbcaa7c77d47b810b45c16d1a9d179be7beb8de76aa-merged.mount: Deactivated successfully.
Dec  1 04:24:50 np0005540741 podman[164614]: 2025-12-01 09:24:50.397094952 +0000 UTC m=+0.216325186 container remove 41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_nobel, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec  1 04:24:50 np0005540741 python3.9[164644]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:24:50 np0005540741 systemd[1]: libpod-conmon-41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de.scope: Deactivated successfully.
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #24. Immutable memtables: 0.
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.460382) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 24
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581090460406, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 1524, "num_deletes": 251, "total_data_size": 1671699, "memory_usage": 1699968, "flush_reason": "Manual Compaction"}
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #25: started
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581090475222, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 25, "file_size": 1629622, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7458, "largest_seqno": 8981, "table_properties": {"data_size": 1622607, "index_size": 4090, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 13908, "raw_average_key_size": 19, "raw_value_size": 1608537, "raw_average_value_size": 2206, "num_data_blocks": 192, "num_entries": 729, "num_filter_entries": 729, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580924, "oldest_key_time": 1764580924, "file_creation_time": 1764581090, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 25, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 14872 microseconds, and 4688 cpu microseconds.
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.475258) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #25: 1629622 bytes OK
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.475271) [db/memtable_list.cc:519] [default] Level-0 commit table #25 started
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.477268) [db/memtable_list.cc:722] [default] Level-0 commit table #25: memtable #1 done
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.477282) EVENT_LOG_v1 {"time_micros": 1764581090477278, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.477317) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 1665052, prev total WAL file size 1665052, number of live WAL files 2.
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000021.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.477823) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [25(1591KB)], [23(4106KB)]
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581090477850, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [25], "files_L6": [23], "score": -1, "input_data_size": 5834408, "oldest_snapshot_seqno": -1}
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #26: 2786 keys, 4626085 bytes, temperature: kUnknown
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581090508870, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 26, "file_size": 4626085, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 4605363, "index_size": 12677, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 6981, "raw_key_size": 64803, "raw_average_key_size": 23, "raw_value_size": 4553249, "raw_average_value_size": 1634, "num_data_blocks": 570, "num_entries": 2786, "num_filter_entries": 2786, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764581090, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.509044) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 4626085 bytes
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.510455) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 187.8 rd, 148.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 4.0 +0.0 blob) out(4.4 +0.0 blob), read-write-amplify(6.4) write-amplify(2.8) OK, records in: 3300, records dropped: 514 output_compression: NoCompression
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.510470) EVENT_LOG_v1 {"time_micros": 1764581090510462, "job": 8, "event": "compaction_finished", "compaction_time_micros": 31072, "compaction_time_cpu_micros": 12548, "output_level": 6, "num_output_files": 1, "total_output_size": 4626085, "num_input_records": 3300, "num_output_records": 2786, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000025.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581090510797, "job": 8, "event": "table_file_deletion", "file_number": 25}
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581090511489, "job": 8, "event": "table_file_deletion", "file_number": 23}
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.477757) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.511524) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.511529) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.511531) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.511532) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:24:50 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.511533) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:24:50 np0005540741 podman[164697]: 2025-12-01 09:24:50.576651011 +0000 UTC m=+0.046183673 container create 3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:24:50 np0005540741 systemd[1]: Started libpod-conmon-3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da.scope.
Dec  1 04:24:50 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:24:50 np0005540741 podman[164697]: 2025-12-01 09:24:50.557173589 +0000 UTC m=+0.026706291 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:24:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bc67315b39f5de3be56dd265f5c237673c0625fb35a83f434336d9dc37ea5b9/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:24:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bc67315b39f5de3be56dd265f5c237673c0625fb35a83f434336d9dc37ea5b9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:24:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bc67315b39f5de3be56dd265f5c237673c0625fb35a83f434336d9dc37ea5b9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:24:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bc67315b39f5de3be56dd265f5c237673c0625fb35a83f434336d9dc37ea5b9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:24:50 np0005540741 podman[164697]: 2025-12-01 09:24:50.668561494 +0000 UTC m=+0.138094186 container init 3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default)
Dec  1 04:24:50 np0005540741 podman[164697]: 2025-12-01 09:24:50.679141013 +0000 UTC m=+0.148673685 container start 3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Dec  1 04:24:50 np0005540741 podman[164697]: 2025-12-01 09:24:50.682953998 +0000 UTC m=+0.152486690 container attach 3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True)
Dec  1 04:24:51 np0005540741 python3.9[164845]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:24:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v387: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]: {
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:        "osd_id": 0,
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:        "type": "bluestore"
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:    },
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:        "osd_id": 1,
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:        "type": "bluestore"
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:    },
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:        "osd_id": 2,
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:        "type": "bluestore"
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]:    }
Dec  1 04:24:51 np0005540741 upbeat_goldberg[164745]: }
Dec  1 04:24:51 np0005540741 systemd[1]: libpod-3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da.scope: Deactivated successfully.
Dec  1 04:24:51 np0005540741 podman[164697]: 2025-12-01 09:24:51.752397866 +0000 UTC m=+1.221930548 container died 3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:24:51 np0005540741 systemd[1]: libpod-3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da.scope: Consumed 1.072s CPU time.
Dec  1 04:24:51 np0005540741 systemd[1]: var-lib-containers-storage-overlay-0bc67315b39f5de3be56dd265f5c237673c0625fb35a83f434336d9dc37ea5b9-merged.mount: Deactivated successfully.
Dec  1 04:24:51 np0005540741 podman[164697]: 2025-12-01 09:24:51.81396116 +0000 UTC m=+1.283493812 container remove 3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  1 04:24:51 np0005540741 systemd[1]: libpod-conmon-3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da.scope: Deactivated successfully.
Dec  1 04:24:51 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:24:51 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:24:51 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:24:51 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:24:51 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev ef520ca2-5572-4fa5-8487-4f03d920b2cc does not exist
Dec  1 04:24:51 np0005540741 python3.9[165023]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:24:52 np0005540741 python3.9[165238]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 04:24:52 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:24:52 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:24:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v388: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:53 np0005540741 python3.9[165390]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 04:24:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:24:53 np0005540741 systemd[1]: Reloading.
Dec  1 04:24:53 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:24:53 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:24:54 np0005540741 python3.9[165578]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:24:55 np0005540741 python3.9[165731]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:24:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v389: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:55 np0005540741 python3.9[165884]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:24:56 np0005540741 python3.9[166037]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:24:57 np0005540741 python3.9[166190]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:24:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v390: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:57 np0005540741 python3.9[166343]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:24:58 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:24:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v391: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:24:59 np0005540741 python3.9[166496]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:25:00 np0005540741 python3.9[166649]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec  1 04:25:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v392: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:01 np0005540741 python3.9[166802]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 04:25:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v393: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:03 np0005540741 python3.9[166960]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  1 04:25:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:25:03 np0005540741 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:25:04 np0005540741 python3.9[167121]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:25:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v394: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:05 np0005540741 python3.9[167205]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:25:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v395: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:08 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:25:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v396: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v397: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:25:12
Dec  1 04:25:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:25:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:25:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'cephfs.cephfs.data', 'images', 'volumes', 'vms', '.mgr', 'cephfs.cephfs.meta']
Dec  1 04:25:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:25:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:25:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:25:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:25:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:25:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:25:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:25:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:25:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:25:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:25:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:25:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:25:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:25:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:25:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:25:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:25:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:25:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v398: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:25:15 np0005540741 podman[167216]: 2025-12-01 09:25:15.099798878 +0000 UTC m=+0.179467688 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  1 04:25:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v399: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v400: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:25:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:25:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:25:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:25:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:25:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:25:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:25:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:25:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:25:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:25:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:25:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:25:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:25:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:25:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:25:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:25:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v401: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:20 np0005540741 podman[167242]: 2025-12-01 09:25:20.013429559 +0000 UTC m=+0.103560772 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  1 04:25:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:25:20.456 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:25:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:25:20.458 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:25:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:25:20.458 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:25:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v402: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v403: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:25:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v404: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v405: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:25:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v406: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v407: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v408: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:25:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v409: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v410: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:38 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:25:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v411: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v412: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:25:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:25:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:25:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:25:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:25:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:25:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v413: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:43 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:25:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v414: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:46 np0005540741 podman[167261]: 2025-12-01 09:25:46.000887899 +0000 UTC m=+0.102190326 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec  1 04:25:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v415: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:25:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v416: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:51 np0005540741 podman[167397]: 2025-12-01 09:25:51.000427579 +0000 UTC m=+0.093057853 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  1 04:25:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v417: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:52 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:25:52 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:25:52 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:25:52 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:25:52 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:25:52 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:25:52 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 21bb6ec5-5182-4dd5-bbd9-91f80a0b53fc does not exist
Dec  1 04:25:52 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 034303ee-3b5a-4ddb-8e9f-d0017971e52e does not exist
Dec  1 04:25:52 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 50088751-42b1-4bef-bae5-606d374c5981 does not exist
Dec  1 04:25:52 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:25:52 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:25:52 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:25:52 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:25:52 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:25:52 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:25:52 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:25:52 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:25:52 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:25:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v418: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:53 np0005540741 podman[167749]: 2025-12-01 09:25:53.445983603 +0000 UTC m=+0.037752503 container create aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_meninsky, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:25:53 np0005540741 systemd[1]: Started libpod-conmon-aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b.scope.
Dec  1 04:25:53 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:25:53 np0005540741 podman[167749]: 2025-12-01 09:25:53.523618138 +0000 UTC m=+0.115387058 container init aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_meninsky, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True)
Dec  1 04:25:53 np0005540741 podman[167749]: 2025-12-01 09:25:53.430102673 +0000 UTC m=+0.021871593 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:25:53 np0005540741 podman[167749]: 2025-12-01 09:25:53.531738273 +0000 UTC m=+0.123507173 container start aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_meninsky, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec  1 04:25:53 np0005540741 podman[167749]: 2025-12-01 09:25:53.535243864 +0000 UTC m=+0.127012784 container attach aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_meninsky, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:25:53 np0005540741 festive_meninsky[167765]: 167 167
Dec  1 04:25:53 np0005540741 systemd[1]: libpod-aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b.scope: Deactivated successfully.
Dec  1 04:25:53 np0005540741 conmon[167765]: conmon aeb4c9b42d061b7849a7 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b.scope/container/memory.events
Dec  1 04:25:53 np0005540741 podman[167749]: 2025-12-01 09:25:53.538163119 +0000 UTC m=+0.129932029 container died aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_meninsky, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:25:53 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:25:53 np0005540741 systemd[1]: var-lib-containers-storage-overlay-30cdbac835516680dadd3061d3d227f72a445f657746992042900490d6e0481f-merged.mount: Deactivated successfully.
Dec  1 04:25:53 np0005540741 podman[167749]: 2025-12-01 09:25:53.641349583 +0000 UTC m=+0.233118513 container remove aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_meninsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:25:53 np0005540741 systemd[1]: libpod-conmon-aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b.scope: Deactivated successfully.
Dec  1 04:25:53 np0005540741 podman[167789]: 2025-12-01 09:25:53.895204346 +0000 UTC m=+0.111087084 container create 93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mirzakhani, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec  1 04:25:53 np0005540741 podman[167789]: 2025-12-01 09:25:53.812883395 +0000 UTC m=+0.028766153 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:25:53 np0005540741 systemd[1]: Started libpod-conmon-93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f.scope.
Dec  1 04:25:53 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:25:53 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94e4ad049bf3f3c465a5d7e940dcd7307c8da0580480d239edde753ba9ba3c32/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:25:53 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94e4ad049bf3f3c465a5d7e940dcd7307c8da0580480d239edde753ba9ba3c32/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:25:53 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94e4ad049bf3f3c465a5d7e940dcd7307c8da0580480d239edde753ba9ba3c32/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:25:53 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94e4ad049bf3f3c465a5d7e940dcd7307c8da0580480d239edde753ba9ba3c32/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:25:53 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94e4ad049bf3f3c465a5d7e940dcd7307c8da0580480d239edde753ba9ba3c32/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:25:53 np0005540741 podman[167789]: 2025-12-01 09:25:53.991409658 +0000 UTC m=+0.207292416 container init 93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mirzakhani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec  1 04:25:53 np0005540741 podman[167789]: 2025-12-01 09:25:53.997855405 +0000 UTC m=+0.213738143 container start 93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mirzakhani, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec  1 04:25:54 np0005540741 podman[167789]: 2025-12-01 09:25:54.000849261 +0000 UTC m=+0.216732019 container attach 93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mirzakhani, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:25:55 np0005540741 gifted_mirzakhani[167805]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:25:55 np0005540741 gifted_mirzakhani[167805]: --> relative data size: 1.0
Dec  1 04:25:55 np0005540741 gifted_mirzakhani[167805]: --> All data devices are unavailable
Dec  1 04:25:55 np0005540741 systemd[1]: libpod-93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f.scope: Deactivated successfully.
Dec  1 04:25:55 np0005540741 podman[167789]: 2025-12-01 09:25:55.061391876 +0000 UTC m=+1.277274624 container died 93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mirzakhani, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:25:55 np0005540741 systemd[1]: var-lib-containers-storage-overlay-94e4ad049bf3f3c465a5d7e940dcd7307c8da0580480d239edde753ba9ba3c32-merged.mount: Deactivated successfully.
Dec  1 04:25:55 np0005540741 podman[167789]: 2025-12-01 09:25:55.302198941 +0000 UTC m=+1.518081689 container remove 93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mirzakhani, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:25:55 np0005540741 systemd[1]: libpod-conmon-93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f.scope: Deactivated successfully.
Dec  1 04:25:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v419: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:55 np0005540741 podman[167986]: 2025-12-01 09:25:55.9196633 +0000 UTC m=+0.041445370 container create 8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_chandrasekhar, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  1 04:25:55 np0005540741 systemd[1]: Started libpod-conmon-8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167.scope.
Dec  1 04:25:55 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:25:55 np0005540741 podman[167986]: 2025-12-01 09:25:55.900199347 +0000 UTC m=+0.021981457 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:25:56 np0005540741 podman[167986]: 2025-12-01 09:25:56.007271644 +0000 UTC m=+0.129053734 container init 8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_chandrasekhar, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec  1 04:25:56 np0005540741 podman[167986]: 2025-12-01 09:25:56.01404482 +0000 UTC m=+0.135826890 container start 8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_chandrasekhar, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:25:56 np0005540741 awesome_chandrasekhar[168002]: 167 167
Dec  1 04:25:56 np0005540741 podman[167986]: 2025-12-01 09:25:56.017534821 +0000 UTC m=+0.139316921 container attach 8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_chandrasekhar, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:25:56 np0005540741 systemd[1]: libpod-8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167.scope: Deactivated successfully.
Dec  1 04:25:56 np0005540741 podman[167986]: 2025-12-01 09:25:56.020365753 +0000 UTC m=+0.142147853 container died 8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_chandrasekhar, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:25:56 np0005540741 systemd[1]: var-lib-containers-storage-overlay-8ccf423937655d59ed2d7f28e51d92fdd822d44310e62c3ce684e787631998c7-merged.mount: Deactivated successfully.
Dec  1 04:25:56 np0005540741 podman[167986]: 2025-12-01 09:25:56.05690912 +0000 UTC m=+0.178691190 container remove 8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_chandrasekhar, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:25:56 np0005540741 systemd[1]: libpod-conmon-8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167.scope: Deactivated successfully.
Dec  1 04:25:56 np0005540741 podman[168028]: 2025-12-01 09:25:56.253673071 +0000 UTC m=+0.063428916 container create cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_mahavira, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 04:25:56 np0005540741 systemd[1]: Started libpod-conmon-cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02.scope.
Dec  1 04:25:56 np0005540741 podman[168028]: 2025-12-01 09:25:56.233895269 +0000 UTC m=+0.043651094 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:25:56 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:25:56 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75da4ca3d8fd2c9c0f30ae86f73b2cb52eeb0746162362cde4d27f33d335fefb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:25:56 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75da4ca3d8fd2c9c0f30ae86f73b2cb52eeb0746162362cde4d27f33d335fefb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:25:56 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75da4ca3d8fd2c9c0f30ae86f73b2cb52eeb0746162362cde4d27f33d335fefb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:25:56 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75da4ca3d8fd2c9c0f30ae86f73b2cb52eeb0746162362cde4d27f33d335fefb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:25:56 np0005540741 podman[168028]: 2025-12-01 09:25:56.35493209 +0000 UTC m=+0.164687935 container init cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_mahavira, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec  1 04:25:56 np0005540741 podman[168028]: 2025-12-01 09:25:56.363842257 +0000 UTC m=+0.173598072 container start cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_mahavira, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  1 04:25:56 np0005540741 podman[168028]: 2025-12-01 09:25:56.366852644 +0000 UTC m=+0.176608539 container attach cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_mahavira, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]: {
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:    "0": [
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:        {
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "devices": [
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "/dev/loop3"
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            ],
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "lv_name": "ceph_lv0",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "lv_size": "21470642176",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "name": "ceph_lv0",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "tags": {
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.cluster_name": "ceph",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.crush_device_class": "",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.encrypted": "0",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.osd_id": "0",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.type": "block",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.vdo": "0"
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            },
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "type": "block",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "vg_name": "ceph_vg0"
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:        }
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:    ],
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:    "1": [
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:        {
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "devices": [
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "/dev/loop4"
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            ],
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "lv_name": "ceph_lv1",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "lv_size": "21470642176",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "name": "ceph_lv1",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "tags": {
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.cluster_name": "ceph",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.crush_device_class": "",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.encrypted": "0",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.osd_id": "1",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.type": "block",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.vdo": "0"
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            },
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "type": "block",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "vg_name": "ceph_vg1"
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:        }
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:    ],
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:    "2": [
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:        {
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "devices": [
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "/dev/loop5"
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            ],
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "lv_name": "ceph_lv2",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "lv_size": "21470642176",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "name": "ceph_lv2",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "tags": {
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.cluster_name": "ceph",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.crush_device_class": "",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.encrypted": "0",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.osd_id": "2",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.type": "block",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:                "ceph.vdo": "0"
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            },
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "type": "block",
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:            "vg_name": "ceph_vg2"
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:        }
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]:    ]
Dec  1 04:25:57 np0005540741 modest_mahavira[168044]: }
Dec  1 04:25:57 np0005540741 systemd[1]: libpod-cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02.scope: Deactivated successfully.
Dec  1 04:25:57 np0005540741 podman[168028]: 2025-12-01 09:25:57.12213042 +0000 UTC m=+0.931886225 container died cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_mahavira, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:25:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v420: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:57 np0005540741 systemd[1]: var-lib-containers-storage-overlay-75da4ca3d8fd2c9c0f30ae86f73b2cb52eeb0746162362cde4d27f33d335fefb-merged.mount: Deactivated successfully.
Dec  1 04:25:57 np0005540741 podman[168028]: 2025-12-01 09:25:57.387620509 +0000 UTC m=+1.197376324 container remove cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_mahavira, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:25:57 np0005540741 systemd[1]: libpod-conmon-cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02.scope: Deactivated successfully.
Dec  1 04:25:58 np0005540741 podman[168208]: 2025-12-01 09:25:58.028930818 +0000 UTC m=+0.048065781 container create e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_northcutt, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec  1 04:25:58 np0005540741 systemd[1]: Started libpod-conmon-e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c.scope.
Dec  1 04:25:58 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:25:58 np0005540741 podman[168208]: 2025-12-01 09:25:58.008429285 +0000 UTC m=+0.027564278 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:25:58 np0005540741 podman[168208]: 2025-12-01 09:25:58.112370451 +0000 UTC m=+0.131505434 container init e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_northcutt, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Dec  1 04:25:58 np0005540741 podman[168208]: 2025-12-01 09:25:58.118999603 +0000 UTC m=+0.138134606 container start e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_northcutt, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec  1 04:25:58 np0005540741 podman[168208]: 2025-12-01 09:25:58.123442342 +0000 UTC m=+0.142577325 container attach e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_northcutt, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:25:58 np0005540741 heuristic_northcutt[168226]: 167 167
Dec  1 04:25:58 np0005540741 systemd[1]: libpod-e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c.scope: Deactivated successfully.
Dec  1 04:25:58 np0005540741 podman[168208]: 2025-12-01 09:25:58.125980945 +0000 UTC m=+0.145115988 container died e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_northcutt, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:25:58 np0005540741 systemd[1]: var-lib-containers-storage-overlay-5e0e0cdc0576956bfe5e149fc42e6c3b1fc283e1069f18e8cc30f68e0fa08ba1-merged.mount: Deactivated successfully.
Dec  1 04:25:58 np0005540741 podman[168208]: 2025-12-01 09:25:58.180991516 +0000 UTC m=+0.200126479 container remove e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_northcutt, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Dec  1 04:25:58 np0005540741 systemd[1]: libpod-conmon-e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c.scope: Deactivated successfully.
Dec  1 04:25:58 np0005540741 podman[168252]: 2025-12-01 09:25:58.360229461 +0000 UTC m=+0.051863372 container create 906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_sutherland, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec  1 04:25:58 np0005540741 systemd[1]: Started libpod-conmon-906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165.scope.
Dec  1 04:25:58 np0005540741 podman[168252]: 2025-12-01 09:25:58.334092015 +0000 UTC m=+0.025725956 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:25:58 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:25:58 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc596e7b0cf8f8b470c9654b6e318c4e5ef2ea1d60dd0c4bd54ffaa77a715665/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:25:58 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc596e7b0cf8f8b470c9654b6e318c4e5ef2ea1d60dd0c4bd54ffaa77a715665/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:25:58 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc596e7b0cf8f8b470c9654b6e318c4e5ef2ea1d60dd0c4bd54ffaa77a715665/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:25:58 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc596e7b0cf8f8b470c9654b6e318c4e5ef2ea1d60dd0c4bd54ffaa77a715665/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:25:58 np0005540741 podman[168252]: 2025-12-01 09:25:58.447047452 +0000 UTC m=+0.138681383 container init 906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_sutherland, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec  1 04:25:58 np0005540741 podman[168252]: 2025-12-01 09:25:58.456149115 +0000 UTC m=+0.147783016 container start 906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_sutherland, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec  1 04:25:58 np0005540741 podman[168252]: 2025-12-01 09:25:58.461586422 +0000 UTC m=+0.153220353 container attach 906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_sutherland, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Dec  1 04:25:58 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:25:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v421: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]: {
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:        "osd_id": 0,
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:        "type": "bluestore"
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:    },
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:        "osd_id": 1,
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:        "type": "bluestore"
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:    },
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:        "osd_id": 2,
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:        "type": "bluestore"
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]:    }
Dec  1 04:25:59 np0005540741 blissful_sutherland[168268]: }
Dec  1 04:25:59 np0005540741 systemd[1]: libpod-906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165.scope: Deactivated successfully.
Dec  1 04:25:59 np0005540741 podman[168252]: 2025-12-01 09:25:59.50841763 +0000 UTC m=+1.200051541 container died 906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_sutherland, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:25:59 np0005540741 systemd[1]: libpod-906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165.scope: Consumed 1.047s CPU time.
Dec  1 04:25:59 np0005540741 systemd[1]: var-lib-containers-storage-overlay-dc596e7b0cf8f8b470c9654b6e318c4e5ef2ea1d60dd0c4bd54ffaa77a715665-merged.mount: Deactivated successfully.
Dec  1 04:25:59 np0005540741 podman[168252]: 2025-12-01 09:25:59.567776067 +0000 UTC m=+1.259409958 container remove 906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_sutherland, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:25:59 np0005540741 systemd[1]: libpod-conmon-906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165.scope: Deactivated successfully.
Dec  1 04:25:59 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:25:59 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:25:59 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:25:59 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:25:59 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 1f41a2df-dd36-42b5-8782-cff7b110fc0c does not exist
Dec  1 04:26:00 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:26:00 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:26:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v422: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v423: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:26:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v424: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v425: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:08 np0005540741 kernel: SELinux:  Converting 2768 SID table entries...
Dec  1 04:26:08 np0005540741 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:26:08 np0005540741 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:26:08 np0005540741 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:26:08 np0005540741 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:26:08 np0005540741 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:26:08 np0005540741 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:26:08 np0005540741 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:26:08 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:26:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v426: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v427: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:26:12
Dec  1 04:26:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:26:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:26:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'vms', 'cephfs.cephfs.data', 'images', 'cephfs.cephfs.meta', '.mgr', 'volumes']
Dec  1 04:26:12 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:26:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:26:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:26:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:26:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:26:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:26:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:26:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:26:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:26:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:26:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:26:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:26:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:26:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:26:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:26:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:26:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:26:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v428: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:26:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v429: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:16 np0005540741 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Dec  1 04:26:17 np0005540741 podman[168373]: 2025-12-01 09:26:17.024107627 +0000 UTC m=+0.106891622 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  1 04:26:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v430: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:18 np0005540741 kernel: SELinux:  Converting 2768 SID table entries...
Dec  1 04:26:18 np0005540741 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:26:18 np0005540741 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:26:18 np0005540741 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:26:18 np0005540741 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:26:18 np0005540741 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:26:18 np0005540741 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:26:18 np0005540741 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:26:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:26:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:26:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:26:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:26:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:26:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:26:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:26:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:26:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:26:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:26:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:26:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:26:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:26:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:26:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:26:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:26:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v431: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:26:20.458 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:26:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:26:20.460 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:26:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:26:20.461 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:26:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v432: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:21 np0005540741 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec  1 04:26:21 np0005540741 podman[168406]: 2025-12-01 09:26:21.979442674 +0000 UTC m=+0.078445510 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:26:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v433: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:26:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v434: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v435: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:26:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v436: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v437: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v438: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:26:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v439: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v440: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:26:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v441: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v442: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:26:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:26:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:26:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:26:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:26:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:26:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v443: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:26:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v444: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v445: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:48 np0005540741 podman[179345]: 2025-12-01 09:26:48.005635473 +0000 UTC m=+0.103195745 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  1 04:26:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:26:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v446: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v447: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:52 np0005540741 podman[182586]: 2025-12-01 09:26:52.994099237 +0000 UTC m=+0.084049842 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  1 04:26:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v448: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:54 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:26:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v449: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v450: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:26:59 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:26:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v451: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:00 np0005540741 podman[185436]: 2025-12-01 09:27:00.456479476 +0000 UTC m=+0.066412082 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec  1 04:27:00 np0005540741 podman[185436]: 2025-12-01 09:27:00.565620743 +0000 UTC m=+0.175553359 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:27:01 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:27:01 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:27:01 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:27:01 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:27:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v452: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:01 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:27:01 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:27:01 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:27:01 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:27:01 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:27:01 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:27:01 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 29faa56f-d14b-410f-9fb2-68c3bc466846 does not exist
Dec  1 04:27:01 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 7a9dc25b-66b8-4eec-840c-b4ca185931e2 does not exist
Dec  1 04:27:01 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 2d3bfa0d-99f2-46eb-9626-708bd8cde677 does not exist
Dec  1 04:27:01 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:27:01 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:27:01 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:27:01 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:27:01 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:27:01 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:27:02 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:27:02 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:27:02 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:27:02 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:27:02 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:27:02 np0005540741 podman[185854]: 2025-12-01 09:27:02.351021314 +0000 UTC m=+0.045310402 container create 1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_engelbart, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:27:02 np0005540741 systemd[1]: Started libpod-conmon-1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc.scope.
Dec  1 04:27:02 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:27:02 np0005540741 podman[185854]: 2025-12-01 09:27:02.330334995 +0000 UTC m=+0.024624173 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:27:02 np0005540741 podman[185854]: 2025-12-01 09:27:02.437789973 +0000 UTC m=+0.132079091 container init 1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_engelbart, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:27:02 np0005540741 podman[185854]: 2025-12-01 09:27:02.446729762 +0000 UTC m=+0.141018860 container start 1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_engelbart, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec  1 04:27:02 np0005540741 podman[185854]: 2025-12-01 09:27:02.450636205 +0000 UTC m=+0.144925303 container attach 1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_engelbart, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec  1 04:27:02 np0005540741 priceless_engelbart[185871]: 167 167
Dec  1 04:27:02 np0005540741 systemd[1]: libpod-1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc.scope: Deactivated successfully.
Dec  1 04:27:02 np0005540741 podman[185854]: 2025-12-01 09:27:02.463153887 +0000 UTC m=+0.157442985 container died 1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_engelbart, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:27:02 np0005540741 systemd[1]: var-lib-containers-storage-overlay-0f6a05b6348241c82bec9aa6285ca3f4791a4195c81e0ade4b11b037687d1b96-merged.mount: Deactivated successfully.
Dec  1 04:27:02 np0005540741 podman[185854]: 2025-12-01 09:27:02.502744502 +0000 UTC m=+0.197033600 container remove 1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_engelbart, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3)
Dec  1 04:27:02 np0005540741 systemd[1]: libpod-conmon-1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc.scope: Deactivated successfully.
Dec  1 04:27:02 np0005540741 podman[185895]: 2025-12-01 09:27:02.673206532 +0000 UTC m=+0.041488251 container create 6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_lalande, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 04:27:02 np0005540741 systemd[1]: Started libpod-conmon-6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b.scope.
Dec  1 04:27:02 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:27:02 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/917b0b7a1b44b3901c5258f8f72eaabbc0a11c2760c493258442fa9ec47b1ad3/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:27:02 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/917b0b7a1b44b3901c5258f8f72eaabbc0a11c2760c493258442fa9ec47b1ad3/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:27:02 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/917b0b7a1b44b3901c5258f8f72eaabbc0a11c2760c493258442fa9ec47b1ad3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:27:02 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/917b0b7a1b44b3901c5258f8f72eaabbc0a11c2760c493258442fa9ec47b1ad3/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:27:02 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/917b0b7a1b44b3901c5258f8f72eaabbc0a11c2760c493258442fa9ec47b1ad3/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:27:02 np0005540741 podman[185895]: 2025-12-01 09:27:02.657180859 +0000 UTC m=+0.025462598 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:27:02 np0005540741 podman[185895]: 2025-12-01 09:27:02.7678537 +0000 UTC m=+0.136135449 container init 6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_lalande, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec  1 04:27:02 np0005540741 podman[185895]: 2025-12-01 09:27:02.776876071 +0000 UTC m=+0.145157790 container start 6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_lalande, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  1 04:27:02 np0005540741 podman[185895]: 2025-12-01 09:27:02.780444594 +0000 UTC m=+0.148726323 container attach 6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_lalande, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3)
Dec  1 04:27:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v453: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:03 np0005540741 strange_lalande[185913]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:27:03 np0005540741 strange_lalande[185913]: --> relative data size: 1.0
Dec  1 04:27:03 np0005540741 strange_lalande[185913]: --> All data devices are unavailable
Dec  1 04:27:03 np0005540741 systemd[1]: libpod-6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b.scope: Deactivated successfully.
Dec  1 04:27:03 np0005540741 podman[185895]: 2025-12-01 09:27:03.868908676 +0000 UTC m=+1.237190395 container died 6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_lalande, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Dec  1 04:27:03 np0005540741 systemd[1]: var-lib-containers-storage-overlay-917b0b7a1b44b3901c5258f8f72eaabbc0a11c2760c493258442fa9ec47b1ad3-merged.mount: Deactivated successfully.
Dec  1 04:27:03 np0005540741 podman[185895]: 2025-12-01 09:27:03.916794861 +0000 UTC m=+1.285076580 container remove 6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_lalande, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec  1 04:27:03 np0005540741 systemd[1]: libpod-conmon-6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b.scope: Deactivated successfully.
Dec  1 04:27:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:27:04 np0005540741 podman[186096]: 2025-12-01 09:27:04.612525024 +0000 UTC m=+0.044265571 container create 5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_mirzakhani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:27:04 np0005540741 systemd[1]: Started libpod-conmon-5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a.scope.
Dec  1 04:27:04 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:27:04 np0005540741 podman[186096]: 2025-12-01 09:27:04.675367062 +0000 UTC m=+0.107107619 container init 5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_mirzakhani, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:27:04 np0005540741 podman[186096]: 2025-12-01 09:27:04.68118433 +0000 UTC m=+0.112924877 container start 5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_mirzakhani, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:27:04 np0005540741 podman[186096]: 2025-12-01 09:27:04.684438564 +0000 UTC m=+0.116179111 container attach 5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_mirzakhani, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec  1 04:27:04 np0005540741 practical_mirzakhani[186112]: 167 167
Dec  1 04:27:04 np0005540741 podman[186096]: 2025-12-01 09:27:04.58957807 +0000 UTC m=+0.021318637 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:27:04 np0005540741 systemd[1]: libpod-5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a.scope: Deactivated successfully.
Dec  1 04:27:04 np0005540741 podman[186096]: 2025-12-01 09:27:04.68773948 +0000 UTC m=+0.119480037 container died 5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_mirzakhani, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec  1 04:27:04 np0005540741 systemd[1]: var-lib-containers-storage-overlay-7e265f55a7e5e888b92e8055db8127a08e2cdcef43cce6fbb9c404baa884e63b-merged.mount: Deactivated successfully.
Dec  1 04:27:04 np0005540741 podman[186096]: 2025-12-01 09:27:04.724551834 +0000 UTC m=+0.156292381 container remove 5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_mirzakhani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:27:04 np0005540741 systemd[1]: libpod-conmon-5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a.scope: Deactivated successfully.
Dec  1 04:27:04 np0005540741 podman[186135]: 2025-12-01 09:27:04.885199861 +0000 UTC m=+0.043363535 container create 9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hodgkin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:27:04 np0005540741 systemd[1]: Started libpod-conmon-9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764.scope.
Dec  1 04:27:04 np0005540741 podman[186135]: 2025-12-01 09:27:04.86579338 +0000 UTC m=+0.023957074 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:27:04 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:27:04 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f415c2420a5d0d5217afe86c70c2d2f7d50061b932649db1b4578bc1a581f03b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:27:04 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f415c2420a5d0d5217afe86c70c2d2f7d50061b932649db1b4578bc1a581f03b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:27:04 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f415c2420a5d0d5217afe86c70c2d2f7d50061b932649db1b4578bc1a581f03b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:27:04 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f415c2420a5d0d5217afe86c70c2d2f7d50061b932649db1b4578bc1a581f03b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:27:05 np0005540741 podman[186135]: 2025-12-01 09:27:05.074881367 +0000 UTC m=+0.233045091 container init 9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hodgkin, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Dec  1 04:27:05 np0005540741 podman[186135]: 2025-12-01 09:27:05.082680693 +0000 UTC m=+0.240844377 container start 9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hodgkin, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec  1 04:27:05 np0005540741 podman[186135]: 2025-12-01 09:27:05.132824813 +0000 UTC m=+0.290988487 container attach 9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hodgkin, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:27:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v454: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]: {
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:    "0": [
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:        {
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "devices": [
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "/dev/loop3"
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            ],
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "lv_name": "ceph_lv0",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "lv_size": "21470642176",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "name": "ceph_lv0",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "tags": {
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.cluster_name": "ceph",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.crush_device_class": "",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.encrypted": "0",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.osd_id": "0",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.type": "block",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.vdo": "0"
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            },
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "type": "block",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "vg_name": "ceph_vg0"
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:        }
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:    ],
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:    "1": [
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:        {
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "devices": [
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "/dev/loop4"
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            ],
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "lv_name": "ceph_lv1",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "lv_size": "21470642176",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "name": "ceph_lv1",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "tags": {
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.cluster_name": "ceph",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.crush_device_class": "",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.encrypted": "0",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.osd_id": "1",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.type": "block",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.vdo": "0"
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            },
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "type": "block",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "vg_name": "ceph_vg1"
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:        }
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:    ],
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:    "2": [
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:        {
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "devices": [
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "/dev/loop5"
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            ],
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "lv_name": "ceph_lv2",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "lv_size": "21470642176",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "name": "ceph_lv2",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "tags": {
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.cluster_name": "ceph",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.crush_device_class": "",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.encrypted": "0",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.osd_id": "2",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.type": "block",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:                "ceph.vdo": "0"
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            },
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "type": "block",
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:            "vg_name": "ceph_vg2"
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:        }
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]:    ]
Dec  1 04:27:05 np0005540741 suspicious_hodgkin[186151]: }
Dec  1 04:27:05 np0005540741 systemd[1]: libpod-9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764.scope: Deactivated successfully.
Dec  1 04:27:05 np0005540741 podman[186160]: 2025-12-01 09:27:05.898896231 +0000 UTC m=+0.022622495 container died 9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hodgkin, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:27:05 np0005540741 systemd[1]: var-lib-containers-storage-overlay-f415c2420a5d0d5217afe86c70c2d2f7d50061b932649db1b4578bc1a581f03b-merged.mount: Deactivated successfully.
Dec  1 04:27:05 np0005540741 podman[186160]: 2025-12-01 09:27:05.95139792 +0000 UTC m=+0.075124154 container remove 9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hodgkin, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:27:05 np0005540741 systemd[1]: libpod-conmon-9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764.scope: Deactivated successfully.
Dec  1 04:27:06 np0005540741 podman[186312]: 2025-12-01 09:27:06.615412815 +0000 UTC m=+0.042661475 container create 45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_allen, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  1 04:27:06 np0005540741 systemd[1]: Started libpod-conmon-45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7.scope.
Dec  1 04:27:06 np0005540741 podman[186312]: 2025-12-01 09:27:06.596036425 +0000 UTC m=+0.023285085 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:27:06 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:27:06 np0005540741 podman[186312]: 2025-12-01 09:27:06.751583664 +0000 UTC m=+0.178832394 container init 45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_allen, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  1 04:27:06 np0005540741 podman[186312]: 2025-12-01 09:27:06.758854644 +0000 UTC m=+0.186103284 container start 45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_allen, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:27:06 np0005540741 bold_allen[186330]: 167 167
Dec  1 04:27:06 np0005540741 systemd[1]: libpod-45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7.scope: Deactivated successfully.
Dec  1 04:27:06 np0005540741 podman[186312]: 2025-12-01 09:27:06.774126986 +0000 UTC m=+0.201375676 container attach 45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_allen, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec  1 04:27:06 np0005540741 podman[186312]: 2025-12-01 09:27:06.774564969 +0000 UTC m=+0.201813619 container died 45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_allen, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec  1 04:27:06 np0005540741 systemd[1]: var-lib-containers-storage-overlay-2a1baba79d5f0ff6b8549bcb836717ea9ff79ac964811f1e515784246bae9f70-merged.mount: Deactivated successfully.
Dec  1 04:27:06 np0005540741 podman[186312]: 2025-12-01 09:27:06.828712234 +0000 UTC m=+0.255960904 container remove 45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_allen, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec  1 04:27:06 np0005540741 systemd[1]: libpod-conmon-45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7.scope: Deactivated successfully.
Dec  1 04:27:07 np0005540741 podman[186354]: 2025-12-01 09:27:07.033168777 +0000 UTC m=+0.031026328 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:27:07 np0005540741 podman[186354]: 2025-12-01 09:27:07.170427778 +0000 UTC m=+0.168285259 container create c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_gould, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec  1 04:27:07 np0005540741 systemd[1]: Started libpod-conmon-c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a.scope.
Dec  1 04:27:07 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:27:07 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4ecbed6342a885750bd179fd6e98adf5bc756f01e689bc6b91df3dcc39b24de/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:27:07 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4ecbed6342a885750bd179fd6e98adf5bc756f01e689bc6b91df3dcc39b24de/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:27:07 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4ecbed6342a885750bd179fd6e98adf5bc756f01e689bc6b91df3dcc39b24de/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:27:07 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4ecbed6342a885750bd179fd6e98adf5bc756f01e689bc6b91df3dcc39b24de/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:27:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v455: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:07 np0005540741 podman[186354]: 2025-12-01 09:27:07.402081618 +0000 UTC m=+0.399939139 container init c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_gould, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec  1 04:27:07 np0005540741 podman[186354]: 2025-12-01 09:27:07.409343638 +0000 UTC m=+0.407201159 container start c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_gould, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec  1 04:27:07 np0005540741 podman[186354]: 2025-12-01 09:27:07.41427363 +0000 UTC m=+0.412131161 container attach c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_gould, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec  1 04:27:08 np0005540741 frosty_gould[186370]: {
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:        "osd_id": 0,
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:        "type": "bluestore"
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:    },
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:        "osd_id": 1,
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:        "type": "bluestore"
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:    },
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:        "osd_id": 2,
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:        "type": "bluestore"
Dec  1 04:27:08 np0005540741 frosty_gould[186370]:    }
Dec  1 04:27:08 np0005540741 frosty_gould[186370]: }
Dec  1 04:27:08 np0005540741 systemd[1]: libpod-c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a.scope: Deactivated successfully.
Dec  1 04:27:08 np0005540741 podman[186354]: 2025-12-01 09:27:08.575719484 +0000 UTC m=+1.573576965 container died c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_gould, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  1 04:27:08 np0005540741 systemd[1]: libpod-c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a.scope: Consumed 1.168s CPU time.
Dec  1 04:27:08 np0005540741 systemd[1]: var-lib-containers-storage-overlay-b4ecbed6342a885750bd179fd6e98adf5bc756f01e689bc6b91df3dcc39b24de-merged.mount: Deactivated successfully.
Dec  1 04:27:08 np0005540741 podman[186354]: 2025-12-01 09:27:08.735509976 +0000 UTC m=+1.733367477 container remove c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_gould, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:27:08 np0005540741 systemd[1]: libpod-conmon-c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a.scope: Deactivated successfully.
Dec  1 04:27:08 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:27:08 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:27:08 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:27:08 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:27:08 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 84ff2688-aca2-48d2-a896-5341958d83c0 does not exist
Dec  1 04:27:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:27:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v456: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:09 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:27:09 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:27:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v457: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:11 np0005540741 kernel: SELinux:  Converting 2769 SID table entries...
Dec  1 04:27:11 np0005540741 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:27:11 np0005540741 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:27:11 np0005540741 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:27:11 np0005540741 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:27:11 np0005540741 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:27:11 np0005540741 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:27:11 np0005540741 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:27:12
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['vms', '.mgr', 'volumes', 'backups', 'images', 'cephfs.cephfs.data', 'cephfs.cephfs.meta']
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:27:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v458: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:13 np0005540741 dbus-broker-launch[757]: Noticed file-system modification, trigger reload.
Dec  1 04:27:13 np0005540741 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec  1 04:27:13 np0005540741 dbus-broker-launch[757]: Noticed file-system modification, trigger reload.
Dec  1 04:27:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:27:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v459: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v460: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:27:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:27:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:27:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:27:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:27:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:27:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:27:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:27:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:27:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:27:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:27:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:27:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:27:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:27:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:27:19 np0005540741 podman[186714]: 2025-12-01 09:27:19.008758534 +0000 UTC m=+0.101705632 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  1 04:27:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:27:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v461: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:27:20.460 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:27:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:27:20.461 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:27:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:27:20.461 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:27:20 np0005540741 systemd[1]: Stopping OpenSSH server daemon...
Dec  1 04:27:20 np0005540741 systemd[1]: sshd.service: Deactivated successfully.
Dec  1 04:27:20 np0005540741 systemd[1]: Stopped OpenSSH server daemon.
Dec  1 04:27:20 np0005540741 systemd[1]: sshd.service: Consumed 2.319s CPU time, read 32.0K from disk, written 0B to disk.
Dec  1 04:27:20 np0005540741 systemd[1]: Stopped target sshd-keygen.target.
Dec  1 04:27:20 np0005540741 systemd[1]: Stopping sshd-keygen.target...
Dec  1 04:27:20 np0005540741 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 04:27:20 np0005540741 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 04:27:20 np0005540741 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 04:27:20 np0005540741 systemd[1]: Reached target sshd-keygen.target.
Dec  1 04:27:20 np0005540741 systemd[1]: Starting OpenSSH server daemon...
Dec  1 04:27:20 np0005540741 systemd[1]: Started OpenSSH server daemon.
Dec  1 04:27:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v462: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:22 np0005540741 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:27:22 np0005540741 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:27:23 np0005540741 systemd[1]: Reloading.
Dec  1 04:27:23 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:27:23 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:27:23 np0005540741 podman[187581]: 2025-12-01 09:27:23.200093773 +0000 UTC m=+0.105952186 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  1 04:27:23 np0005540741 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:27:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v463: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:27:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v464: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:26 np0005540741 python3.9[191850]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 04:27:26 np0005540741 systemd[1]: Reloading.
Dec  1 04:27:27 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:27:27 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:27:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v465: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:28 np0005540741 python3.9[193233]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 04:27:28 np0005540741 systemd[1]: Reloading.
Dec  1 04:27:28 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:27:28 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:27:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:27:29 np0005540741 python3.9[194639]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 04:27:29 np0005540741 systemd[1]: Reloading.
Dec  1 04:27:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v466: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:29 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:27:29 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:27:30 np0005540741 python3.9[195998]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 04:27:30 np0005540741 systemd[1]: Reloading.
Dec  1 04:27:30 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:27:30 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:27:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v467: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:31 np0005540741 python3.9[196790]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:31 np0005540741 systemd[1]: Reloading.
Dec  1 04:27:31 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:27:31 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:27:32 np0005540741 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:27:32 np0005540741 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:27:32 np0005540741 systemd[1]: man-db-cache-update.service: Consumed 9.972s CPU time.
Dec  1 04:27:32 np0005540741 systemd[1]: run-r9e0e46fc0b554e62bf094c5b2e064c83.service: Deactivated successfully.
Dec  1 04:27:32 np0005540741 python3.9[197097]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:32 np0005540741 systemd[1]: Reloading.
Dec  1 04:27:32 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:27:32 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:27:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v468: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:33 np0005540741 python3.9[197288]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:34 np0005540741 systemd[1]: Reloading.
Dec  1 04:27:34 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:27:34 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:27:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:27:35 np0005540741 python3.9[197478]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v469: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:35 np0005540741 python3.9[197633]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:35 np0005540741 systemd[1]: Reloading.
Dec  1 04:27:36 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:27:36 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:27:37 np0005540741 python3.9[197823]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 04:27:37 np0005540741 systemd[1]: Reloading.
Dec  1 04:27:37 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:27:37 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:27:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v470: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:37 np0005540741 systemd[1]: Listening on libvirt proxy daemon socket.
Dec  1 04:27:37 np0005540741 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec  1 04:27:38 np0005540741 python3.9[198016]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:27:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v471: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:39 np0005540741 python3.9[198171]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:40 np0005540741 python3.9[198326]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:41 np0005540741 python3.9[198481]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v472: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:42 np0005540741 python3.9[198636]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:27:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:27:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:27:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:27:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:27:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:27:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v473: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:27:44 np0005540741 python3.9[198791]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v474: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:45 np0005540741 python3.9[198946]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:46 np0005540741 python3.9[199101]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v475: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:47 np0005540741 python3.9[199256]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:48 np0005540741 python3.9[199411]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:49 np0005540741 python3.9[199566]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:49 np0005540741 podman[199568]: 2025-12-01 09:27:49.225270495 +0000 UTC m=+0.132507704 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:27:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:27:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v476: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:49 np0005540741 python3.9[199746]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:50 np0005540741 python3.9[199901]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v477: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:51 np0005540741 python3.9[200056]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 04:27:52 np0005540741 python3.9[200211]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:27:52 np0005540741 auditd[702]: Audit daemon rotating log files
Dec  1 04:27:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v478: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:53 np0005540741 python3.9[200363]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:27:53 np0005540741 podman[200487]: 2025-12-01 09:27:53.885217816 +0000 UTC m=+0.052366586 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec  1 04:27:54 np0005540741 python3.9[200535]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:27:54 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:27:54 np0005540741 python3.9[200687]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:27:55 np0005540741 python3.9[200839]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:27:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v479: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:55 np0005540741 python3.9[200991]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:27:56 np0005540741 python3.9[201143]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:27:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v480: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:27:57 np0005540741 python3.9[201268]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764581276.1379938-554-175687639951767/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:27:58 np0005540741 python3.9[201420]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:27:59 np0005540741 python3.9[201545]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764581278.1158066-554-224690296856351/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:27:59 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:27:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v481: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:00 np0005540741 python3.9[201697]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:00 np0005540741 python3.9[201822]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764581279.4553518-554-250676472119938/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:01 np0005540741 python3.9[201974]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v482: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:01 np0005540741 python3.9[202099]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764581280.9123046-554-74411250698731/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:02 np0005540741 python3.9[202251]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:03 np0005540741 python3.9[202376]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764581282.080971-554-32897695285985/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v483: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:03 np0005540741 python3.9[202528]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:28:04 np0005540741 python3.9[202653]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764581283.4543128-554-235847549705753/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:05 np0005540741 python3.9[202805]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v484: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:05 np0005540741 python3.9[202928]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764581284.744884-554-236708452641224/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:06 np0005540741 python3.9[203080]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:06 np0005540741 python3.9[203205]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764581285.8917146-554-149804307004937/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v485: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:07 np0005540741 python3.9[203357]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec  1 04:28:08 np0005540741 python3.9[203510]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:09 np0005540741 python3.9[203662]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:28:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v486: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:28:09 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:28:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:28:09 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:28:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:28:09 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:28:09 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 22bc7996-b20c-4bf2-a779-bbbb8b5ae20e does not exist
Dec  1 04:28:09 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 1b5ff66e-45e3-46d8-93d9-290c8001a41d does not exist
Dec  1 04:28:09 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 80aa6222-f655-4b0e-8a05-bd889f7d92db does not exist
Dec  1 04:28:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:28:09 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:28:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:28:09 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:28:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:28:09 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:28:09 np0005540741 python3.9[203931]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:10 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:28:10 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:28:10 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:28:10 np0005540741 podman[204238]: 2025-12-01 09:28:10.385916537 +0000 UTC m=+0.035851348 container create fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_bose, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec  1 04:28:10 np0005540741 python3.9[204210]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:10 np0005540741 systemd[1]: Started libpod-conmon-fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310.scope.
Dec  1 04:28:10 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:28:10 np0005540741 podman[204238]: 2025-12-01 09:28:10.369163832 +0000 UTC m=+0.019098673 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:28:10 np0005540741 podman[204238]: 2025-12-01 09:28:10.47592106 +0000 UTC m=+0.125855891 container init fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_bose, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS)
Dec  1 04:28:10 np0005540741 podman[204238]: 2025-12-01 09:28:10.484487298 +0000 UTC m=+0.134422109 container start fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_bose, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:28:10 np0005540741 podman[204238]: 2025-12-01 09:28:10.487715781 +0000 UTC m=+0.137650592 container attach fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_bose, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:28:10 np0005540741 systemd[1]: libpod-fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310.scope: Deactivated successfully.
Dec  1 04:28:10 np0005540741 suspicious_bose[204254]: 167 167
Dec  1 04:28:10 np0005540741 podman[204238]: 2025-12-01 09:28:10.493629852 +0000 UTC m=+0.143564663 container died fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_bose, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3)
Dec  1 04:28:10 np0005540741 conmon[204254]: conmon fe8d54eb3a7af097d222 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310.scope/container/memory.events
Dec  1 04:28:10 np0005540741 systemd[1]: var-lib-containers-storage-overlay-49f40e1095c32e4bdb0ccda9fd970639d402e948b46fb239af8b947b9d848b6d-merged.mount: Deactivated successfully.
Dec  1 04:28:10 np0005540741 podman[204238]: 2025-12-01 09:28:10.540375444 +0000 UTC m=+0.190310255 container remove fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_bose, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:28:10 np0005540741 systemd[1]: libpod-conmon-fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310.scope: Deactivated successfully.
Dec  1 04:28:10 np0005540741 podman[204354]: 2025-12-01 09:28:10.700763153 +0000 UTC m=+0.046434524 container create d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:28:10 np0005540741 systemd[1]: Started libpod-conmon-d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1.scope.
Dec  1 04:28:10 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:28:10 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f80134766c796e2390f7bcac3fad2168cd516a6c3d418c3f3e08ace97809e94/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:28:10 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f80134766c796e2390f7bcac3fad2168cd516a6c3d418c3f3e08ace97809e94/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:28:10 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f80134766c796e2390f7bcac3fad2168cd516a6c3d418c3f3e08ace97809e94/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:28:10 np0005540741 podman[204354]: 2025-12-01 09:28:10.677085068 +0000 UTC m=+0.022756479 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:28:10 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f80134766c796e2390f7bcac3fad2168cd516a6c3d418c3f3e08ace97809e94/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:28:10 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f80134766c796e2390f7bcac3fad2168cd516a6c3d418c3f3e08ace97809e94/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:28:10 np0005540741 podman[204354]: 2025-12-01 09:28:10.782508888 +0000 UTC m=+0.128180279 container init d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:28:10 np0005540741 podman[204354]: 2025-12-01 09:28:10.791785426 +0000 UTC m=+0.137456817 container start d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:28:10 np0005540741 podman[204354]: 2025-12-01 09:28:10.795210695 +0000 UTC m=+0.140882076 container attach d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec  1 04:28:11 np0005540741 python3.9[204450]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v487: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:11 np0005540741 python3.9[204605]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:11 np0005540741 sweet_jennings[204393]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:28:11 np0005540741 sweet_jennings[204393]: --> relative data size: 1.0
Dec  1 04:28:11 np0005540741 sweet_jennings[204393]: --> All data devices are unavailable
Dec  1 04:28:11 np0005540741 systemd[1]: libpod-d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1.scope: Deactivated successfully.
Dec  1 04:28:11 np0005540741 podman[204354]: 2025-12-01 09:28:11.813099675 +0000 UTC m=+1.158771106 container died d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:28:11 np0005540741 systemd[1]: var-lib-containers-storage-overlay-9f80134766c796e2390f7bcac3fad2168cd516a6c3d418c3f3e08ace97809e94-merged.mount: Deactivated successfully.
Dec  1 04:28:11 np0005540741 podman[204354]: 2025-12-01 09:28:11.878030463 +0000 UTC m=+1.223701824 container remove d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:28:11 np0005540741 systemd[1]: libpod-conmon-d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1.scope: Deactivated successfully.
Dec  1 04:28:12 np0005540741 python3.9[204870]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:12 np0005540741 podman[204982]: 2025-12-01 09:28:12.495112522 +0000 UTC m=+0.039012090 container create 29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_shirley, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:28:12 np0005540741 systemd[1]: Started libpod-conmon-29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0.scope.
Dec  1 04:28:12 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:28:12 np0005540741 podman[204982]: 2025-12-01 09:28:12.572894181 +0000 UTC m=+0.116793769 container init 29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_shirley, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:28:12 np0005540741 podman[204982]: 2025-12-01 09:28:12.478874542 +0000 UTC m=+0.022774130 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:28:12 np0005540741 podman[204982]: 2025-12-01 09:28:12.581995335 +0000 UTC m=+0.125894903 container start 29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_shirley, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:28:12 np0005540741 podman[204982]: 2025-12-01 09:28:12.585183577 +0000 UTC m=+0.129083145 container attach 29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_shirley, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:28:12 np0005540741 determined_shirley[205035]: 167 167
Dec  1 04:28:12 np0005540741 systemd[1]: libpod-29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0.scope: Deactivated successfully.
Dec  1 04:28:12 np0005540741 podman[204982]: 2025-12-01 09:28:12.586897856 +0000 UTC m=+0.130797424 container died 29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_shirley, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec  1 04:28:12 np0005540741 systemd[1]: var-lib-containers-storage-overlay-be7ad9939f75d794f82579afd3618c586362b2647d5e8aed93676927d3927f83-merged.mount: Deactivated successfully.
Dec  1 04:28:12 np0005540741 podman[204982]: 2025-12-01 09:28:12.621356303 +0000 UTC m=+0.165255871 container remove 29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_shirley, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Dec  1 04:28:12 np0005540741 systemd[1]: libpod-conmon-29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0.scope: Deactivated successfully.
Dec  1 04:28:12 np0005540741 podman[205128]: 2025-12-01 09:28:12.810686449 +0000 UTC m=+0.071158169 container create 4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_cartwright, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default)
Dec  1 04:28:12 np0005540741 systemd[1]: Started libpod-conmon-4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145.scope.
Dec  1 04:28:12 np0005540741 podman[205128]: 2025-12-01 09:28:12.762957799 +0000 UTC m=+0.023429549 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:28:12 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:28:12 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/243094111e362cac9d172b3a21290e1c1a99794435ba016dd7ad7bd2ae2432f2/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:28:12 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/243094111e362cac9d172b3a21290e1c1a99794435ba016dd7ad7bd2ae2432f2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:28:12 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/243094111e362cac9d172b3a21290e1c1a99794435ba016dd7ad7bd2ae2432f2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:28:12 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/243094111e362cac9d172b3a21290e1c1a99794435ba016dd7ad7bd2ae2432f2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:28:12 np0005540741 python3.9[205122]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:12 np0005540741 podman[205128]: 2025-12-01 09:28:12.954087797 +0000 UTC m=+0.214559547 container init 4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_cartwright, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:28:12 np0005540741 podman[205128]: 2025-12-01 09:28:12.960366789 +0000 UTC m=+0.220838519 container start 4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_cartwright, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:28:12 np0005540741 podman[205128]: 2025-12-01 09:28:12.981361626 +0000 UTC m=+0.241833386 container attach 4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_cartwright, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:28:13
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'vms', '.mgr', 'images', 'cephfs.cephfs.meta', 'cephfs.cephfs.data', 'volumes']
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v488: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]: {
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:    "0": [
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:        {
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "devices": [
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "/dev/loop3"
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            ],
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "lv_name": "ceph_lv0",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "lv_size": "21470642176",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "name": "ceph_lv0",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "tags": {
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.cluster_name": "ceph",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.crush_device_class": "",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.encrypted": "0",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.osd_id": "0",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.type": "block",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.vdo": "0"
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            },
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "type": "block",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "vg_name": "ceph_vg0"
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:        }
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:    ],
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:    "1": [
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:        {
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "devices": [
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "/dev/loop4"
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            ],
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "lv_name": "ceph_lv1",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "lv_size": "21470642176",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "name": "ceph_lv1",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "tags": {
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.cluster_name": "ceph",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.crush_device_class": "",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.encrypted": "0",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.osd_id": "1",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.type": "block",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.vdo": "0"
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            },
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "type": "block",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "vg_name": "ceph_vg1"
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:        }
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:    ],
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:    "2": [
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:        {
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "devices": [
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "/dev/loop5"
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            ],
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "lv_name": "ceph_lv2",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "lv_size": "21470642176",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "name": "ceph_lv2",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "tags": {
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.cluster_name": "ceph",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.crush_device_class": "",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.encrypted": "0",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.osd_id": "2",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.type": "block",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:                "ceph.vdo": "0"
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            },
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "type": "block",
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:            "vg_name": "ceph_vg2"
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:        }
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]:    ]
Dec  1 04:28:13 np0005540741 laughing_cartwright[205145]: }
Dec  1 04:28:13 np0005540741 python3.9[205301]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:13 np0005540741 systemd[1]: libpod-4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145.scope: Deactivated successfully.
Dec  1 04:28:13 np0005540741 podman[205128]: 2025-12-01 09:28:13.768224835 +0000 UTC m=+1.028696585 container died 4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_cartwright, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec  1 04:28:13 np0005540741 systemd[1]: var-lib-containers-storage-overlay-243094111e362cac9d172b3a21290e1c1a99794435ba016dd7ad7bd2ae2432f2-merged.mount: Deactivated successfully.
Dec  1 04:28:13 np0005540741 podman[205128]: 2025-12-01 09:28:13.833321048 +0000 UTC m=+1.093792778 container remove 4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_cartwright, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:28:13 np0005540741 systemd[1]: libpod-conmon-4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145.scope: Deactivated successfully.
Dec  1 04:28:13 np0005540741 ceph-mgr[75324]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3312476512
Dec  1 04:28:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:28:14 np0005540741 python3.9[205567]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:14 np0005540741 podman[205607]: 2025-12-01 09:28:14.469672864 +0000 UTC m=+0.041749399 container create 2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_elion, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec  1 04:28:14 np0005540741 systemd[1]: Started libpod-conmon-2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651.scope.
Dec  1 04:28:14 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:28:14 np0005540741 podman[205607]: 2025-12-01 09:28:14.452579059 +0000 UTC m=+0.024655594 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:28:14 np0005540741 podman[205607]: 2025-12-01 09:28:14.54941879 +0000 UTC m=+0.121495345 container init 2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_elion, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:28:14 np0005540741 podman[205607]: 2025-12-01 09:28:14.557583136 +0000 UTC m=+0.129659661 container start 2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_elion, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507)
Dec  1 04:28:14 np0005540741 podman[205607]: 2025-12-01 09:28:14.560947204 +0000 UTC m=+0.133023729 container attach 2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_elion, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec  1 04:28:14 np0005540741 infallible_elion[205646]: 167 167
Dec  1 04:28:14 np0005540741 systemd[1]: libpod-2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651.scope: Deactivated successfully.
Dec  1 04:28:14 np0005540741 podman[205607]: 2025-12-01 09:28:14.562678564 +0000 UTC m=+0.134755089 container died 2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_elion, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec  1 04:28:14 np0005540741 systemd[1]: var-lib-containers-storage-overlay-2c38ad948cdde6b23e2c9aeff377c33e8ce699284ae4a169ec8937ba05f6be81-merged.mount: Deactivated successfully.
Dec  1 04:28:14 np0005540741 podman[205607]: 2025-12-01 09:28:14.600792336 +0000 UTC m=+0.172868861 container remove 2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_elion, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  1 04:28:14 np0005540741 systemd[1]: libpod-conmon-2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651.scope: Deactivated successfully.
Dec  1 04:28:14 np0005540741 podman[205745]: 2025-12-01 09:28:14.769474225 +0000 UTC m=+0.041106800 container create a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_ishizaka, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:28:14 np0005540741 systemd[1]: Started libpod-conmon-a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c.scope.
Dec  1 04:28:14 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:28:14 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e36358037fbbb4de1271a434b2d15e138974d5f43c9b941f13b7524c32f92610/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:28:14 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e36358037fbbb4de1271a434b2d15e138974d5f43c9b941f13b7524c32f92610/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:28:14 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e36358037fbbb4de1271a434b2d15e138974d5f43c9b941f13b7524c32f92610/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:28:14 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e36358037fbbb4de1271a434b2d15e138974d5f43c9b941f13b7524c32f92610/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:28:14 np0005540741 podman[205745]: 2025-12-01 09:28:14.841666812 +0000 UTC m=+0.113299377 container init a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_ishizaka, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:28:14 np0005540741 podman[205745]: 2025-12-01 09:28:14.750689032 +0000 UTC m=+0.022321607 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:28:14 np0005540741 podman[205745]: 2025-12-01 09:28:14.852188426 +0000 UTC m=+0.123820991 container start a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_ishizaka, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True)
Dec  1 04:28:14 np0005540741 podman[205745]: 2025-12-01 09:28:14.855481512 +0000 UTC m=+0.127114087 container attach a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_ishizaka, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:28:15 np0005540741 python3.9[205816]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v489: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:15 np0005540741 python3.9[205970]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]: {
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:        "osd_id": 0,
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:        "type": "bluestore"
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:    },
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:        "osd_id": 1,
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:        "type": "bluestore"
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:    },
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:        "osd_id": 2,
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:        "type": "bluestore"
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]:    }
Dec  1 04:28:15 np0005540741 quirky_ishizaka[205805]: }
Dec  1 04:28:15 np0005540741 systemd[1]: libpod-a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c.scope: Deactivated successfully.
Dec  1 04:28:15 np0005540741 podman[205745]: 2025-12-01 09:28:15.901085565 +0000 UTC m=+1.172718110 container died a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_ishizaka, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec  1 04:28:15 np0005540741 systemd[1]: libpod-a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c.scope: Consumed 1.053s CPU time.
Dec  1 04:28:15 np0005540741 systemd[1]: var-lib-containers-storage-overlay-e36358037fbbb4de1271a434b2d15e138974d5f43c9b941f13b7524c32f92610-merged.mount: Deactivated successfully.
Dec  1 04:28:15 np0005540741 podman[205745]: 2025-12-01 09:28:15.954136669 +0000 UTC m=+1.225769224 container remove a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_ishizaka, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:28:15 np0005540741 systemd[1]: libpod-conmon-a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c.scope: Deactivated successfully.
Dec  1 04:28:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:28:15 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:28:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:28:16 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:28:16 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 103817a1-9080-4f6d-9587-583b9e2f4d5e does not exist
Dec  1 04:28:16 np0005540741 python3.9[206212]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:16 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:28:16 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:28:17 np0005540741 python3.9[206364]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v490: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:17 np0005540741 python3.9[206516]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:18 np0005540741 python3.9[206639]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581297.3584514-775-150555474384716/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:28:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:28:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:28:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:28:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:28:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:28:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:28:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:28:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:28:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:28:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:28:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:28:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:28:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:28:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:28:19 np0005540741 python3.9[206791]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:28:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v491: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:19 np0005540741 podman[206886]: 2025-12-01 09:28:19.585402168 +0000 UTC m=+0.101512137 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  1 04:28:19 np0005540741 python3.9[206930]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581298.6639555-775-41504929997351/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:20 np0005540741 python3.9[207091]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:28:20.461 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:28:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:28:20.462 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:28:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:28:20.462 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:28:20 np0005540741 python3.9[207214]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581299.845651-775-69273951457665/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v492: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:21 np0005540741 python3.9[207366]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:21 np0005540741 python3.9[207489]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581300.9906926-775-103812361140376/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:22 np0005540741 python3.9[207641]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:23 np0005540741 python3.9[207764]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581302.183005-775-163393557397729/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v493: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:24 np0005540741 python3.9[207916]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:28:24 np0005540741 podman[208011]: 2025-12-01 09:28:24.40818252 +0000 UTC m=+0.050627575 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Dec  1 04:28:24 np0005540741 python3.9[208058]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581303.4017506-775-93537649270581/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:25 np0005540741 python3.9[208210]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v494: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:25 np0005540741 python3.9[208333]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581304.7469008-775-160116931691371/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:26 np0005540741 python3.9[208485]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:26 np0005540741 python3.9[208608]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581305.9030778-775-91481268059870/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v495: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:27 np0005540741 python3.9[208760]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:27 np0005540741 python3.9[208883]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581307.0551562-775-110121608971846/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:28 np0005540741 python3.9[209035]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:29 np0005540741 python3.9[209158]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581308.1291695-775-78100187418647/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:28:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v496: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:29 np0005540741 python3.9[209310]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:30 np0005540741 python3.9[209433]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581309.463916-775-43498553648316/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:31 np0005540741 python3.9[209585]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v497: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:31 np0005540741 python3.9[209708]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581310.7062364-775-198040237461578/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:32 np0005540741 python3.9[209860]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:32 np0005540741 python3.9[209983]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581311.9012904-775-181873014793249/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v498: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:33 np0005540741 python3.9[210135]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:34 np0005540741 python3.9[210258]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581313.0578291-775-181381597770655/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:28:34 np0005540741 python3.9[210408]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:28:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v499: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:35 np0005540741 python3.9[210563]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec  1 04:28:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v500: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:37 np0005540741 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec  1 04:28:37 np0005540741 python3.9[210719]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:38 np0005540741 python3.9[210871]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:28:39 np0005540741 python3.9[211023]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v501: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:39 np0005540741 python3.9[211175]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:40 np0005540741 python3.9[211327]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:41 np0005540741 python3.9[211479]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v502: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:41 np0005540741 python3.9[211631]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:42 np0005540741 python3.9[211783]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:28:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:28:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:28:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:28:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:28:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:28:43 np0005540741 python3.9[211935]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v503: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:43 np0005540741 python3.9[212087]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:28:44 np0005540741 python3.9[212239]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:28:44 np0005540741 systemd[1]: Reloading.
Dec  1 04:28:44 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:28:44 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #27. Immutable memtables: 0.
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.793724) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 27
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581324793781, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 2035, "num_deletes": 251, "total_data_size": 2345115, "memory_usage": 2391208, "flush_reason": "Manual Compaction"}
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #28: started
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581324809155, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 28, "file_size": 2274390, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8982, "largest_seqno": 11016, "table_properties": {"data_size": 2265213, "index_size": 5799, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 17692, "raw_average_key_size": 19, "raw_value_size": 2246934, "raw_average_value_size": 2466, "num_data_blocks": 267, "num_entries": 911, "num_filter_entries": 911, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764581091, "oldest_key_time": 1764581091, "file_creation_time": 1764581324, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 28, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 15473 microseconds, and 6717 cpu microseconds.
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.809213) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #28: 2274390 bytes OK
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.809238) [db/memtable_list.cc:519] [default] Level-0 commit table #28 started
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.810768) [db/memtable_list.cc:722] [default] Level-0 commit table #28: memtable #1 done
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.810783) EVENT_LOG_v1 {"time_micros": 1764581324810779, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.810801) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 2336643, prev total WAL file size 2336643, number of live WAL files 2.
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000024.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.812559) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [28(2221KB)], [26(4517KB)]
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581324812594, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [28], "files_L6": [26], "score": -1, "input_data_size": 6900475, "oldest_snapshot_seqno": -1}
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #29: 3183 keys, 5798191 bytes, temperature: kUnknown
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581324851388, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 29, "file_size": 5798191, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5772889, "index_size": 16233, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8005, "raw_key_size": 73570, "raw_average_key_size": 23, "raw_value_size": 5711862, "raw_average_value_size": 1794, "num_data_blocks": 718, "num_entries": 3183, "num_filter_entries": 3183, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764581324, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.851665) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 5798191 bytes
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.853026) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 177.5 rd, 149.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 4.4 +0.0 blob) out(5.5 +0.0 blob), read-write-amplify(5.6) write-amplify(2.5) OK, records in: 3697, records dropped: 514 output_compression: NoCompression
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.853048) EVENT_LOG_v1 {"time_micros": 1764581324853038, "job": 10, "event": "compaction_finished", "compaction_time_micros": 38876, "compaction_time_cpu_micros": 15183, "output_level": 6, "num_output_files": 1, "total_output_size": 5798191, "num_input_records": 3697, "num_output_records": 3183, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000028.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581324853735, "job": 10, "event": "table_file_deletion", "file_number": 28}
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581324854953, "job": 10, "event": "table_file_deletion", "file_number": 26}
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.812420) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.855134) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.855141) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.855143) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.855144) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:28:44 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.855146) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:28:45 np0005540741 systemd[1]: Starting libvirt logging daemon socket...
Dec  1 04:28:45 np0005540741 systemd[1]: Listening on libvirt logging daemon socket.
Dec  1 04:28:45 np0005540741 systemd[1]: Starting libvirt logging daemon admin socket...
Dec  1 04:28:45 np0005540741 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec  1 04:28:45 np0005540741 systemd[1]: Starting libvirt logging daemon...
Dec  1 04:28:45 np0005540741 systemd[1]: Started libvirt logging daemon.
Dec  1 04:28:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v504: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:45 np0005540741 python3.9[212433]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:28:46 np0005540741 systemd[1]: Reloading.
Dec  1 04:28:46 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:28:46 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:28:46 np0005540741 systemd[1]: Starting libvirt nodedev daemon socket...
Dec  1 04:28:46 np0005540741 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec  1 04:28:46 np0005540741 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec  1 04:28:46 np0005540741 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec  1 04:28:46 np0005540741 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec  1 04:28:46 np0005540741 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec  1 04:28:46 np0005540741 systemd[1]: Starting libvirt nodedev daemon...
Dec  1 04:28:46 np0005540741 systemd[1]: Started libvirt nodedev daemon.
Dec  1 04:28:47 np0005540741 python3.9[212649]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:28:47 np0005540741 systemd[1]: Reloading.
Dec  1 04:28:47 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:28:47 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:28:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v505: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:47 np0005540741 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec  1 04:28:47 np0005540741 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec  1 04:28:47 np0005540741 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec  1 04:28:47 np0005540741 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec  1 04:28:47 np0005540741 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec  1 04:28:47 np0005540741 systemd[1]: Starting libvirt proxy daemon...
Dec  1 04:28:47 np0005540741 systemd[1]: Started libvirt proxy daemon.
Dec  1 04:28:47 np0005540741 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec  1 04:28:47 np0005540741 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec  1 04:28:48 np0005540741 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec  1 04:28:48 np0005540741 python3.9[212867]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:28:48 np0005540741 systemd[1]: Reloading.
Dec  1 04:28:48 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:28:48 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:28:48 np0005540741 systemd[1]: Listening on libvirt locking daemon socket.
Dec  1 04:28:48 np0005540741 systemd[1]: Starting libvirt QEMU daemon socket...
Dec  1 04:28:48 np0005540741 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec  1 04:28:48 np0005540741 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec  1 04:28:48 np0005540741 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec  1 04:28:48 np0005540741 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec  1 04:28:48 np0005540741 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec  1 04:28:48 np0005540741 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec  1 04:28:48 np0005540741 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec  1 04:28:48 np0005540741 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec  1 04:28:48 np0005540741 systemd[1]: Starting libvirt QEMU daemon...
Dec  1 04:28:48 np0005540741 systemd[1]: Started libvirt QEMU daemon.
Dec  1 04:28:48 np0005540741 setroubleshoot[212685]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 1e55f597-2f82-4df1-b992-f49fa7fcf036
Dec  1 04:28:48 np0005540741 setroubleshoot[212685]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec  1 04:28:48 np0005540741 setroubleshoot[212685]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 1e55f597-2f82-4df1-b992-f49fa7fcf036
Dec  1 04:28:48 np0005540741 setroubleshoot[212685]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec  1 04:28:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:28:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v506: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:49 np0005540741 python3.9[213084]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:28:49 np0005540741 systemd[1]: Reloading.
Dec  1 04:28:49 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:28:49 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:28:49 np0005540741 podman[213086]: 2025-12-01 09:28:49.842913836 +0000 UTC m=+0.118459138 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:28:50 np0005540741 systemd[1]: Starting libvirt secret daemon socket...
Dec  1 04:28:50 np0005540741 systemd[1]: Listening on libvirt secret daemon socket.
Dec  1 04:28:50 np0005540741 systemd[1]: Starting libvirt secret daemon admin socket...
Dec  1 04:28:50 np0005540741 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec  1 04:28:50 np0005540741 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec  1 04:28:50 np0005540741 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec  1 04:28:50 np0005540741 systemd[1]: Starting libvirt secret daemon...
Dec  1 04:28:50 np0005540741 systemd[1]: Started libvirt secret daemon.
Dec  1 04:28:50 np0005540741 python3.9[213322]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v507: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:51 np0005540741 python3.9[213474]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 04:28:52 np0005540741 python3.9[213626]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:28:52 np0005540741 python3.9[213780]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 04:28:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v508: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:53 np0005540741 python3.9[213930]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:28:54 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:28:54 np0005540741 python3.9[214051]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581333.3169858-1133-99746231312967/.source.xml follow=False _original_basename=secret.xml.j2 checksum=972bce57f1b968e3bea30025319af4764744aa0e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:54 np0005540741 podman[214175]: 2025-12-01 09:28:54.842206052 +0000 UTC m=+0.050975575 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec  1 04:28:55 np0005540741 python3.9[214220]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 5620a9fb-e540-5250-a0e8-7aaad5347e3b#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:28:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v509: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:55 np0005540741 python3.9[214382]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v510: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:57 np0005540741 python3.9[214845]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:28:59 np0005540741 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec  1 04:28:59 np0005540741 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec  1 04:28:59 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:28:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v511: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:28:59 np0005540741 python3.9[214997]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:29:00 np0005540741 python3.9[215120]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581337.9535317-1188-28803400859215/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:01 np0005540741 python3.9[215272]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v512: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:02 np0005540741 python3.9[215424]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:29:02 np0005540741 python3.9[215502]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:03 np0005540741 python3.9[215654]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:29:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v513: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:03 np0005540741 python3.9[215732]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.287xdcm9 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:04 np0005540741 python3.9[215884]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:29:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:29:04 np0005540741 python3.9[215962]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v514: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:05 np0005540741 python3.9[216114]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:29:06 np0005540741 python3[216267]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  1 04:29:07 np0005540741 python3.9[216419]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:29:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v515: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:07 np0005540741 python3.9[216497]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:08 np0005540741 python3.9[216649]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:29:08 np0005540741 python3.9[216727]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:29:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v516: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:09 np0005540741 python3.9[216879]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:29:10 np0005540741 python3.9[216957]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:10 np0005540741 python3.9[217109]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:29:11 np0005540741 python3.9[217187]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v517: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:11 np0005540741 python3.9[217339]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:29:12 np0005540741 python3.9[217464]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581351.4179072-1313-134781417579801/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:29:13
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['vms', 'volumes', 'cephfs.cephfs.data', 'images', '.mgr', 'backups', 'cephfs.cephfs.meta']
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:29:13 np0005540741 python3.9[217616]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:29:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v518: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:13 np0005540741 python3.9[217768]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:29:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:29:14 np0005540741 python3.9[217923]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:15 np0005540741 python3.9[218075]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:29:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v519: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:16 np0005540741 python3.9[218228]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:29:16 np0005540741 python3.9[218482]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:29:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:29:16 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:29:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:29:16 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:29:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:29:16 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:29:16 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 56aff331-8ab8-4dd8-a9b0-f5a9d59847b8 does not exist
Dec  1 04:29:16 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev a19cb7a8-3e7d-4a95-b2a4-e975d2443207 does not exist
Dec  1 04:29:16 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 96e88a56-2d0e-42e8-be95-674d6adfda93 does not exist
Dec  1 04:29:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:29:16 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:29:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:29:16 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:29:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:29:16 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:29:17 np0005540741 python3.9[218768]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:17 np0005540741 podman[218809]: 2025-12-01 09:29:17.347555627 +0000 UTC m=+0.047503435 container create e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_curie, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:29:17 np0005540741 systemd[1]: Started libpod-conmon-e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587.scope.
Dec  1 04:29:17 np0005540741 podman[218809]: 2025-12-01 09:29:17.329195266 +0000 UTC m=+0.029143094 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:29:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v520: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:17 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:29:17 np0005540741 podman[218809]: 2025-12-01 09:29:17.443132872 +0000 UTC m=+0.143080710 container init e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_curie, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec  1 04:29:17 np0005540741 podman[218809]: 2025-12-01 09:29:17.451454033 +0000 UTC m=+0.151401851 container start e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_curie, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:29:17 np0005540741 podman[218809]: 2025-12-01 09:29:17.455352765 +0000 UTC m=+0.155300613 container attach e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_curie, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507)
Dec  1 04:29:17 np0005540741 tender_curie[218850]: 167 167
Dec  1 04:29:17 np0005540741 systemd[1]: libpod-e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587.scope: Deactivated successfully.
Dec  1 04:29:17 np0005540741 podman[218809]: 2025-12-01 09:29:17.457839817 +0000 UTC m=+0.157787625 container died e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_curie, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:29:17 np0005540741 systemd[1]: var-lib-containers-storage-overlay-aba983264edde1c34908a495a9f031eac5688654f1e505d1aaf74f11d7b72f2b-merged.mount: Deactivated successfully.
Dec  1 04:29:17 np0005540741 podman[218809]: 2025-12-01 09:29:17.49493174 +0000 UTC m=+0.194879548 container remove e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_curie, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec  1 04:29:17 np0005540741 systemd[1]: libpod-conmon-e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587.scope: Deactivated successfully.
Dec  1 04:29:17 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:29:17 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:29:17 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:29:17 np0005540741 podman[218939]: 2025-12-01 09:29:17.661356554 +0000 UTC m=+0.041444630 container create 61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_jennings, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:29:17 np0005540741 systemd[1]: Started libpod-conmon-61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e.scope.
Dec  1 04:29:17 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:29:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc20b70bfeb344c8050da1f33b07de8ec7ef6f4c82709c28dfe62118018a5927/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:29:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc20b70bfeb344c8050da1f33b07de8ec7ef6f4c82709c28dfe62118018a5927/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:29:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc20b70bfeb344c8050da1f33b07de8ec7ef6f4c82709c28dfe62118018a5927/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:29:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc20b70bfeb344c8050da1f33b07de8ec7ef6f4c82709c28dfe62118018a5927/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:29:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc20b70bfeb344c8050da1f33b07de8ec7ef6f4c82709c28dfe62118018a5927/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:29:17 np0005540741 podman[218939]: 2025-12-01 09:29:17.735006824 +0000 UTC m=+0.115094920 container init 61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_jennings, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec  1 04:29:17 np0005540741 podman[218939]: 2025-12-01 09:29:17.644345972 +0000 UTC m=+0.024434048 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:29:17 np0005540741 podman[218939]: 2025-12-01 09:29:17.747034822 +0000 UTC m=+0.127122908 container start 61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_jennings, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:29:17 np0005540741 podman[218939]: 2025-12-01 09:29:17.75076071 +0000 UTC m=+0.130848786 container attach 61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_jennings, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  1 04:29:17 np0005540741 python3.9[219022]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:29:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:29:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:29:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:29:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:29:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:29:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:29:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:29:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:29:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:29:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:29:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:29:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:29:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:29:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:29:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:29:18 np0005540741 python3.9[219145]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581357.5100846-1385-209251408862537/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:18 np0005540741 intelligent_jennings[218989]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:29:18 np0005540741 intelligent_jennings[218989]: --> relative data size: 1.0
Dec  1 04:29:18 np0005540741 intelligent_jennings[218989]: --> All data devices are unavailable
Dec  1 04:29:19 np0005540741 systemd[1]: libpod-61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e.scope: Deactivated successfully.
Dec  1 04:29:19 np0005540741 podman[218939]: 2025-12-01 09:29:19.010922928 +0000 UTC m=+1.391011004 container died 61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_jennings, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:29:19 np0005540741 systemd[1]: libpod-61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e.scope: Consumed 1.141s CPU time.
Dec  1 04:29:19 np0005540741 python3.9[219331]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:29:19 np0005540741 systemd[1]: var-lib-containers-storage-overlay-cc20b70bfeb344c8050da1f33b07de8ec7ef6f4c82709c28dfe62118018a5927-merged.mount: Deactivated successfully.
Dec  1 04:29:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:29:19 np0005540741 podman[218939]: 2025-12-01 09:29:19.335196627 +0000 UTC m=+1.715284713 container remove 61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_jennings, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec  1 04:29:19 np0005540741 systemd[1]: libpod-conmon-61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e.scope: Deactivated successfully.
Dec  1 04:29:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v521: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:19 np0005540741 python3.9[219536]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581358.7448077-1400-265868212999446/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:19 np0005540741 podman[219620]: 2025-12-01 09:29:19.901186267 +0000 UTC m=+0.029518114 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:29:19 np0005540741 podman[219620]: 2025-12-01 09:29:19.994778874 +0000 UTC m=+0.123110701 container create be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_tharp, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Dec  1 04:29:20 np0005540741 systemd[1]: Started libpod-conmon-be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8.scope.
Dec  1 04:29:20 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:29:20 np0005540741 podman[219620]: 2025-12-01 09:29:20.08451456 +0000 UTC m=+0.212846407 container init be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_tharp, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:29:20 np0005540741 podman[219620]: 2025-12-01 09:29:20.092363657 +0000 UTC m=+0.220695484 container start be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_tharp, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:29:20 np0005540741 interesting_tharp[219689]: 167 167
Dec  1 04:29:20 np0005540741 systemd[1]: libpod-be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8.scope: Deactivated successfully.
Dec  1 04:29:20 np0005540741 podman[219620]: 2025-12-01 09:29:20.162135205 +0000 UTC m=+0.290467032 container attach be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_tharp, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:29:20 np0005540741 podman[219620]: 2025-12-01 09:29:20.163402362 +0000 UTC m=+0.291734209 container died be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_tharp, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:29:20 np0005540741 systemd[1]: var-lib-containers-storage-overlay-1b7e60088397a842e833bdd0c25bd91d1a5e276e5a6a8eb47b2f705dbb9b39e5-merged.mount: Deactivated successfully.
Dec  1 04:29:20 np0005540741 podman[219620]: 2025-12-01 09:29:20.228538156 +0000 UTC m=+0.356869983 container remove be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_tharp, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:29:20 np0005540741 systemd[1]: libpod-conmon-be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8.scope: Deactivated successfully.
Dec  1 04:29:20 np0005540741 podman[219690]: 2025-12-01 09:29:20.312166294 +0000 UTC m=+0.237717386 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec  1 04:29:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:29:20.462 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:29:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:29:20.462 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:29:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:29:20.463 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:29:20 np0005540741 python3.9[219805]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:29:20 np0005540741 podman[219817]: 2025-12-01 09:29:20.392767886 +0000 UTC m=+0.025918511 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:29:20 np0005540741 podman[219817]: 2025-12-01 09:29:20.493362465 +0000 UTC m=+0.126513080 container create 76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_beaver, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec  1 04:29:20 np0005540741 systemd[1]: Started libpod-conmon-76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74.scope.
Dec  1 04:29:20 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:29:20 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53f8cdfa72f7e468458f2834eac30a4bf8983ddd5fd1574bcde880ae35e2bbb0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:29:20 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53f8cdfa72f7e468458f2834eac30a4bf8983ddd5fd1574bcde880ae35e2bbb0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:29:20 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53f8cdfa72f7e468458f2834eac30a4bf8983ddd5fd1574bcde880ae35e2bbb0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:29:20 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53f8cdfa72f7e468458f2834eac30a4bf8983ddd5fd1574bcde880ae35e2bbb0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:29:20 np0005540741 podman[219817]: 2025-12-01 09:29:20.700155367 +0000 UTC m=+0.333305972 container init 76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_beaver, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec  1 04:29:20 np0005540741 podman[219817]: 2025-12-01 09:29:20.707960752 +0000 UTC m=+0.341111327 container start 76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_beaver, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:29:20 np0005540741 podman[219817]: 2025-12-01 09:29:20.719949309 +0000 UTC m=+0.353099904 container attach 76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_beaver, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:29:21 np0005540741 python3.9[219961]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581359.9236312-1415-91334037796122/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v522: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:21 np0005540741 strange_beaver[219841]: {
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:    "0": [
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:        {
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "devices": [
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "/dev/loop3"
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            ],
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "lv_name": "ceph_lv0",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "lv_size": "21470642176",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "name": "ceph_lv0",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "tags": {
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.cluster_name": "ceph",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.crush_device_class": "",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.encrypted": "0",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.osd_id": "0",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.type": "block",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.vdo": "0"
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            },
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "type": "block",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "vg_name": "ceph_vg0"
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:        }
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:    ],
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:    "1": [
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:        {
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "devices": [
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "/dev/loop4"
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            ],
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "lv_name": "ceph_lv1",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "lv_size": "21470642176",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "name": "ceph_lv1",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "tags": {
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.cluster_name": "ceph",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.crush_device_class": "",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.encrypted": "0",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.osd_id": "1",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.type": "block",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.vdo": "0"
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            },
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "type": "block",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "vg_name": "ceph_vg1"
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:        }
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:    ],
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:    "2": [
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:        {
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "devices": [
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "/dev/loop5"
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            ],
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "lv_name": "ceph_lv2",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "lv_size": "21470642176",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "name": "ceph_lv2",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "tags": {
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.cluster_name": "ceph",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.crush_device_class": "",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.encrypted": "0",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.osd_id": "2",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.type": "block",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:                "ceph.vdo": "0"
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            },
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "type": "block",
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:            "vg_name": "ceph_vg2"
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:        }
Dec  1 04:29:21 np0005540741 strange_beaver[219841]:    ]
Dec  1 04:29:21 np0005540741 strange_beaver[219841]: }
Dec  1 04:29:21 np0005540741 systemd[1]: libpod-76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74.scope: Deactivated successfully.
Dec  1 04:29:21 np0005540741 podman[219817]: 2025-12-01 09:29:21.562177039 +0000 UTC m=+1.195327615 container died 76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_beaver, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:29:21 np0005540741 systemd[1]: var-lib-containers-storage-overlay-53f8cdfa72f7e468458f2834eac30a4bf8983ddd5fd1574bcde880ae35e2bbb0-merged.mount: Deactivated successfully.
Dec  1 04:29:21 np0005540741 podman[219817]: 2025-12-01 09:29:21.781534284 +0000 UTC m=+1.414684879 container remove 76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_beaver, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:29:21 np0005540741 systemd[1]: libpod-conmon-76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74.scope: Deactivated successfully.
Dec  1 04:29:22 np0005540741 python3.9[220144]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:29:22 np0005540741 systemd[1]: Reloading.
Dec  1 04:29:22 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:29:22 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:29:22 np0005540741 podman[220306]: 2025-12-01 09:29:22.384539885 +0000 UTC m=+0.040291516 container create 49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_swirles, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:29:22 np0005540741 podman[220306]: 2025-12-01 09:29:22.365819724 +0000 UTC m=+0.021571375 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:29:22 np0005540741 systemd[1]: Started libpod-conmon-49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a.scope.
Dec  1 04:29:22 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:29:22 np0005540741 systemd[1]: Reached target edpm_libvirt.target.
Dec  1 04:29:22 np0005540741 podman[220306]: 2025-12-01 09:29:22.712655696 +0000 UTC m=+0.368407367 container init 49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_swirles, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:29:22 np0005540741 podman[220306]: 2025-12-01 09:29:22.720670728 +0000 UTC m=+0.376422359 container start 49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_swirles, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:29:22 np0005540741 sad_swirles[220323]: 167 167
Dec  1 04:29:22 np0005540741 podman[220306]: 2025-12-01 09:29:22.724731425 +0000 UTC m=+0.380483056 container attach 49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_swirles, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:29:22 np0005540741 systemd[1]: libpod-49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a.scope: Deactivated successfully.
Dec  1 04:29:22 np0005540741 podman[220306]: 2025-12-01 09:29:22.729614556 +0000 UTC m=+0.385366207 container died 49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_swirles, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:29:22 np0005540741 systemd[1]: var-lib-containers-storage-overlay-ac4c2638152228515c012ceb77ac5fc747b57c8854ef405a1384b4a73a3545ea-merged.mount: Deactivated successfully.
Dec  1 04:29:22 np0005540741 podman[220306]: 2025-12-01 09:29:22.845889139 +0000 UTC m=+0.501640770 container remove 49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_swirles, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS)
Dec  1 04:29:22 np0005540741 systemd[1]: libpod-conmon-49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a.scope: Deactivated successfully.
Dec  1 04:29:23 np0005540741 podman[220452]: 2025-12-01 09:29:22.977914987 +0000 UTC m=+0.023279724 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:29:23 np0005540741 podman[220452]: 2025-12-01 09:29:23.157525202 +0000 UTC m=+0.202889909 container create 497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_wozniak, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Dec  1 04:29:23 np0005540741 systemd[1]: Started libpod-conmon-497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88.scope.
Dec  1 04:29:23 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:29:23 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f906a6faa141318a3c68c444321de5dd4d57bf1a196498ed6159c5ea461e2f5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:29:23 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f906a6faa141318a3c68c444321de5dd4d57bf1a196498ed6159c5ea461e2f5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:29:23 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f906a6faa141318a3c68c444321de5dd4d57bf1a196498ed6159c5ea461e2f5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:29:23 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f906a6faa141318a3c68c444321de5dd4d57bf1a196498ed6159c5ea461e2f5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:29:23 np0005540741 podman[220452]: 2025-12-01 09:29:23.263811326 +0000 UTC m=+0.309176063 container init 497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_wozniak, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef)
Dec  1 04:29:23 np0005540741 podman[220452]: 2025-12-01 09:29:23.272487777 +0000 UTC m=+0.317852484 container start 497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_wozniak, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec  1 04:29:23 np0005540741 python3.9[220516]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  1 04:29:23 np0005540741 podman[220452]: 2025-12-01 09:29:23.336692594 +0000 UTC m=+0.382057331 container attach 497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_wozniak, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:29:23 np0005540741 systemd[1]: Reloading.
Dec  1 04:29:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v523: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:23 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:29:23 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:29:23 np0005540741 systemd[1]: Reloading.
Dec  1 04:29:23 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:29:23 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]: {
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:        "osd_id": 0,
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:        "type": "bluestore"
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:    },
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:        "osd_id": 1,
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:        "type": "bluestore"
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:    },
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:        "osd_id": 2,
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:        "type": "bluestore"
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]:    }
Dec  1 04:29:24 np0005540741 keen_wozniak[220519]: }
Dec  1 04:29:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:29:24 np0005540741 systemd[1]: libpod-497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88.scope: Deactivated successfully.
Dec  1 04:29:24 np0005540741 systemd[1]: libpod-497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88.scope: Consumed 1.057s CPU time.
Dec  1 04:29:24 np0005540741 podman[220452]: 2025-12-01 09:29:24.345219285 +0000 UTC m=+1.390583992 container died 497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_wozniak, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec  1 04:29:24 np0005540741 systemd[1]: session-49.scope: Deactivated successfully.
Dec  1 04:29:24 np0005540741 systemd[1]: session-49.scope: Consumed 3min 29.176s CPU time.
Dec  1 04:29:24 np0005540741 systemd-logind[788]: Session 49 logged out. Waiting for processes to exit.
Dec  1 04:29:24 np0005540741 systemd-logind[788]: Removed session 49.
Dec  1 04:29:24 np0005540741 systemd[1]: var-lib-containers-storage-overlay-1f906a6faa141318a3c68c444321de5dd4d57bf1a196498ed6159c5ea461e2f5-merged.mount: Deactivated successfully.
Dec  1 04:29:24 np0005540741 podman[220452]: 2025-12-01 09:29:24.687975129 +0000 UTC m=+1.733339846 container remove 497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_wozniak, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:29:24 np0005540741 systemd[1]: libpod-conmon-497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88.scope: Deactivated successfully.
Dec  1 04:29:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:29:24 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:29:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:29:24 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:29:24 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 95a0ece0-f8d3-4d5a-9897-fa9ce172ac75 does not exist
Dec  1 04:29:24 np0005540741 podman[220709]: 2025-12-01 09:29:24.951557443 +0000 UTC m=+0.064936410 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  1 04:29:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v524: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:25 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:29:25 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:29:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v525: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:29:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v526: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:30 np0005540741 systemd-logind[788]: New session 50 of user zuul.
Dec  1 04:29:30 np0005540741 systemd[1]: Started Session 50 of User zuul.
Dec  1 04:29:31 np0005540741 python3.9[220882]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:29:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v527: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:32 np0005540741 python3.9[221036]: ansible-ansible.builtin.service_facts Invoked
Dec  1 04:29:32 np0005540741 network[221053]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:29:32 np0005540741 network[221054]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:29:32 np0005540741 network[221055]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:29:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v528: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:29:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v529: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:36 np0005540741 python3.9[221327]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:29:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v530: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:37 np0005540741 python3.9[221411]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:29:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:29:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v531: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v532: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:29:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:29:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:29:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:29:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:29:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:29:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v533: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:43 np0005540741 python3.9[221564]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:29:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:29:45 np0005540741 python3.9[221716]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:29:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v534: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:45 np0005540741 python3.9[221869]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:29:46 np0005540741 python3.9[222021]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:29:47 np0005540741 python3.9[222174]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:29:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v535: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:47 np0005540741 python3.9[222297]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581386.6061609-95-64919500063399/.source.iscsi _original_basename=.d9ygdvds follow=False checksum=8b34ecf17114dfe93c0af71f0eb5d4d0f9d9b273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:48 np0005540741 python3.9[222449]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:29:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v536: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:49 np0005540741 python3.9[222601]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:29:50 np0005540741 podman[222725]: 2025-12-01 09:29:50.578467478 +0000 UTC m=+0.165997272 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  1 04:29:50 np0005540741 python3.9[222771]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:29:50 np0005540741 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec  1 04:29:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v537: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:51 np0005540741 python3.9[222933]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:29:51 np0005540741 systemd[1]: Reloading.
Dec  1 04:29:51 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:29:51 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:29:52 np0005540741 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec  1 04:29:52 np0005540741 systemd[1]: Starting Open-iSCSI...
Dec  1 04:29:52 np0005540741 kernel: Loading iSCSI transport class v2.0-870.
Dec  1 04:29:52 np0005540741 systemd[1]: Started Open-iSCSI.
Dec  1 04:29:52 np0005540741 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec  1 04:29:52 np0005540741 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec  1 04:29:52 np0005540741 python3.9[223133]: ansible-ansible.builtin.service_facts Invoked
Dec  1 04:29:53 np0005540741 network[223150]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:29:53 np0005540741 network[223151]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:29:53 np0005540741 network[223152]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:29:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v538: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:54 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:29:55 np0005540741 podman[223206]: 2025-12-01 09:29:55.051230985 +0000 UTC m=+0.055970259 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec  1 04:29:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v539: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:57 np0005540741 python3.9[223442]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  1 04:29:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v540: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:58 np0005540741 python3.9[223594]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec  1 04:29:59 np0005540741 python3.9[223750]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:29:59 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:29:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v541: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:29:59 np0005540741 python3.9[223873]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581398.5807467-172-228270595855104/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:00 np0005540741 python3.9[224025]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v542: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:01 np0005540741 python3.9[224177]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:30:01 np0005540741 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec  1 04:30:01 np0005540741 systemd[1]: Stopped Load Kernel Modules.
Dec  1 04:30:01 np0005540741 systemd[1]: Stopping Load Kernel Modules...
Dec  1 04:30:01 np0005540741 systemd[1]: Starting Load Kernel Modules...
Dec  1 04:30:01 np0005540741 systemd[1]: Finished Load Kernel Modules.
Dec  1 04:30:02 np0005540741 python3.9[224333]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:30:03 np0005540741 python3.9[224485]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:30:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v543: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:03 np0005540741 python3.9[224637]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:30:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:30:04 np0005540741 python3.9[224789]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:30:04 np0005540741 python3.9[224912]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581403.929408-230-256672124374212/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v544: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:05 np0005540741 python3.9[225064]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:30:06 np0005540741 python3.9[225217]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:07 np0005540741 python3.9[225369]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v545: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:07 np0005540741 python3.9[225521]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:08 np0005540741 python3.9[225673]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:08 np0005540741 python3.9[225825]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:30:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v546: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:09 np0005540741 python3.9[225977]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:10 np0005540741 python3.9[226129]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:10 np0005540741 python3.9[226281]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:30:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v547: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:11 np0005540741 python3.9[226435]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:12 np0005540741 python3.9[226587]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:30:13
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['vms', 'backups', 'images', '.mgr', 'cephfs.cephfs.data', 'volumes', 'cephfs.cephfs.meta']
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:30:13 np0005540741 python3.9[226739]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:30:13 np0005540741 python3.9[226817]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:30:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v548: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:14 np0005540741 python3.9[226969]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:30:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:30:14 np0005540741 python3.9[227047]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:30:15 np0005540741 python3.9[227199]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v549: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:16 np0005540741 python3.9[227351]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:30:16 np0005540741 python3.9[227429]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:17 np0005540741 python3.9[227581]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:30:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v550: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:17 np0005540741 python3.9[227659]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:18 np0005540741 python3.9[227811]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:30:18 np0005540741 systemd[1]: Reloading.
Dec  1 04:30:18 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:30:18 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:30:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:30:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:30:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:30:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:30:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:30:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:30:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:30:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:30:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:30:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:30:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:30:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:30:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:30:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:30:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:30:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:30:19 np0005540741 python3.9[228000]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:30:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v551: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:19 np0005540741 python3.9[228078]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:30:20.463 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:30:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:30:20.464 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:30:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:30:20.465 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:30:20 np0005540741 python3.9[228230]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:30:21 np0005540741 podman[228280]: 2025-12-01 09:30:21.052734163 +0000 UTC m=+0.157044343 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec  1 04:30:21 np0005540741 python3.9[228319]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v552: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:22 np0005540741 python3.9[228486]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:30:22 np0005540741 systemd[1]: Reloading.
Dec  1 04:30:22 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:30:22 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:30:22 np0005540741 systemd[1]: Starting Create netns directory...
Dec  1 04:30:22 np0005540741 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  1 04:30:22 np0005540741 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  1 04:30:22 np0005540741 systemd[1]: Finished Create netns directory.
Dec  1 04:30:23 np0005540741 python3.9[228679]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:30:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v553: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:23 np0005540741 python3.9[228831]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #30. Immutable memtables: 0.
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.347037) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 30
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581424347089, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 992, "num_deletes": 251, "total_data_size": 962242, "memory_usage": 981456, "flush_reason": "Manual Compaction"}
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #31: started
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581424353519, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 31, "file_size": 585179, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11017, "largest_seqno": 12008, "table_properties": {"data_size": 581381, "index_size": 1514, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9682, "raw_average_key_size": 19, "raw_value_size": 573173, "raw_average_value_size": 1179, "num_data_blocks": 69, "num_entries": 486, "num_filter_entries": 486, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764581325, "oldest_key_time": 1764581325, "file_creation_time": 1764581424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 31, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 6511 microseconds, and 2929 cpu microseconds.
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.353560) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #31: 585179 bytes OK
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.353581) [db/memtable_list.cc:519] [default] Level-0 commit table #31 started
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.354656) [db/memtable_list.cc:722] [default] Level-0 commit table #31: memtable #1 done
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.354671) EVENT_LOG_v1 {"time_micros": 1764581424354666, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.354691) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 957567, prev total WAL file size 957567, number of live WAL files 2.
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000027.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.355306) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323530' seq:72057594037927935, type:22 .. '6D67727374617400353032' seq:0, type:0; will stop at (end)
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [31(571KB)], [29(5662KB)]
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581424355333, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [31], "files_L6": [29], "score": -1, "input_data_size": 6383370, "oldest_snapshot_seqno": -1}
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #32: 3199 keys, 4658455 bytes, temperature: kUnknown
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581424387892, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 32, "file_size": 4658455, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 4636400, "index_size": 12986, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8005, "raw_key_size": 74157, "raw_average_key_size": 23, "raw_value_size": 4578328, "raw_average_value_size": 1431, "num_data_blocks": 579, "num_entries": 3199, "num_filter_entries": 3199, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764581424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.388252) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 4658455 bytes
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.389824) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 195.1 rd, 142.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 5.5 +0.0 blob) out(4.4 +0.0 blob), read-write-amplify(18.9) write-amplify(8.0) OK, records in: 3669, records dropped: 470 output_compression: NoCompression
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.389862) EVENT_LOG_v1 {"time_micros": 1764581424389843, "job": 12, "event": "compaction_finished", "compaction_time_micros": 32720, "compaction_time_cpu_micros": 11797, "output_level": 6, "num_output_files": 1, "total_output_size": 4658455, "num_input_records": 3669, "num_output_records": 3199, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000031.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581424390187, "job": 12, "event": "table_file_deletion", "file_number": 31}
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581424392266, "job": 12, "event": "table_file_deletion", "file_number": 29}
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.355228) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.392393) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.392401) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.392404) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.392406) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:30:24 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.392409) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:30:24 np0005540741 python3.9[228954]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581423.4307067-437-18414245863872/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:30:25 np0005540741 podman[229132]: 2025-12-01 09:30:25.15547645 +0000 UTC m=+0.072945061 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  1 04:30:25 np0005540741 python3.9[229225]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:30:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v554: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:30:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:30:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:30:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:30:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:30:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:30:25 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 2869960a-b803-437d-b91f-2975596ac662 does not exist
Dec  1 04:30:25 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 94adc5e7-6c60-49e4-9a61-fe86eff64826 does not exist
Dec  1 04:30:25 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 4353b2dc-5aaa-4544-838b-4bd1eff88270 does not exist
Dec  1 04:30:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:30:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:30:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:30:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:30:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:30:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:30:26 np0005540741 python3.9[229483]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:30:26 np0005540741 podman[229591]: 2025-12-01 09:30:26.225003295 +0000 UTC m=+0.046903098 container create 80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_yonath, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Dec  1 04:30:26 np0005540741 systemd[1]: Started libpod-conmon-80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57.scope.
Dec  1 04:30:26 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:30:26 np0005540741 podman[229591]: 2025-12-01 09:30:26.197045156 +0000 UTC m=+0.018944989 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:30:26 np0005540741 podman[229591]: 2025-12-01 09:30:26.309612792 +0000 UTC m=+0.131512655 container init 80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_yonath, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec  1 04:30:26 np0005540741 podman[229591]: 2025-12-01 09:30:26.316739978 +0000 UTC m=+0.138639791 container start 80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_yonath, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:30:26 np0005540741 podman[229591]: 2025-12-01 09:30:26.320397224 +0000 UTC m=+0.142297027 container attach 80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_yonath, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec  1 04:30:26 np0005540741 dreamy_yonath[229633]: 167 167
Dec  1 04:30:26 np0005540741 systemd[1]: libpod-80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57.scope: Deactivated successfully.
Dec  1 04:30:26 np0005540741 podman[229591]: 2025-12-01 09:30:26.322568207 +0000 UTC m=+0.144468010 container died 80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_yonath, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default)
Dec  1 04:30:26 np0005540741 systemd[1]: var-lib-containers-storage-overlay-f789f30a689397fb5ce232d15e68b9c5a22516a7691bbb16f1e7275c2e52d5fe-merged.mount: Deactivated successfully.
Dec  1 04:30:26 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:30:26 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:30:26 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:30:26 np0005540741 podman[229591]: 2025-12-01 09:30:26.369267577 +0000 UTC m=+0.191167380 container remove 80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_yonath, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:30:26 np0005540741 systemd[1]: libpod-conmon-80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57.scope: Deactivated successfully.
Dec  1 04:30:26 np0005540741 python3.9[229701]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581425.5845556-462-171766235447270/.source.json _original_basename=.12dpb14q follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:26 np0005540741 podman[229708]: 2025-12-01 09:30:26.570902419 +0000 UTC m=+0.037148745 container create 7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_shockley, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Dec  1 04:30:26 np0005540741 systemd[1]: Started libpod-conmon-7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d.scope.
Dec  1 04:30:26 np0005540741 podman[229708]: 2025-12-01 09:30:26.555463763 +0000 UTC m=+0.021710119 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:30:26 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:30:26 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb0acbcc2a8ed7c023da20cd142e1a53cbcb348d5bb00a6b5651d33888fe6b5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:30:26 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb0acbcc2a8ed7c023da20cd142e1a53cbcb348d5bb00a6b5651d33888fe6b5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:30:26 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb0acbcc2a8ed7c023da20cd142e1a53cbcb348d5bb00a6b5651d33888fe6b5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:30:26 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb0acbcc2a8ed7c023da20cd142e1a53cbcb348d5bb00a6b5651d33888fe6b5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:30:26 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb0acbcc2a8ed7c023da20cd142e1a53cbcb348d5bb00a6b5651d33888fe6b5/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:30:26 np0005540741 podman[229708]: 2025-12-01 09:30:26.680500929 +0000 UTC m=+0.146747275 container init 7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_shockley, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:30:26 np0005540741 podman[229708]: 2025-12-01 09:30:26.690443947 +0000 UTC m=+0.156690283 container start 7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_shockley, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:30:26 np0005540741 podman[229708]: 2025-12-01 09:30:26.693724192 +0000 UTC m=+0.159970548 container attach 7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_shockley, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:30:27 np0005540741 python3.9[229881]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v555: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:28 np0005540741 compassionate_shockley[229727]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:30:28 np0005540741 compassionate_shockley[229727]: --> relative data size: 1.0
Dec  1 04:30:28 np0005540741 compassionate_shockley[229727]: --> All data devices are unavailable
Dec  1 04:30:28 np0005540741 systemd[1]: libpod-7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d.scope: Deactivated successfully.
Dec  1 04:30:28 np0005540741 systemd[1]: libpod-7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d.scope: Consumed 1.122s CPU time.
Dec  1 04:30:28 np0005540741 podman[229708]: 2025-12-01 09:30:28.350392338 +0000 UTC m=+1.816638664 container died 7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_shockley, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:30:28 np0005540741 systemd[1]: var-lib-containers-storage-overlay-dbb0acbcc2a8ed7c023da20cd142e1a53cbcb348d5bb00a6b5651d33888fe6b5-merged.mount: Deactivated successfully.
Dec  1 04:30:28 np0005540741 podman[229708]: 2025-12-01 09:30:28.407725046 +0000 UTC m=+1.873971382 container remove 7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_shockley, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec  1 04:30:28 np0005540741 systemd[1]: libpod-conmon-7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d.scope: Deactivated successfully.
Dec  1 04:30:29 np0005540741 podman[230360]: 2025-12-01 09:30:29.046028159 +0000 UTC m=+0.057067882 container create 2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_bell, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  1 04:30:29 np0005540741 systemd[1]: Started libpod-conmon-2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707.scope.
Dec  1 04:30:29 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:30:29 np0005540741 podman[230360]: 2025-12-01 09:30:29.027162183 +0000 UTC m=+0.038201936 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:30:29 np0005540741 podman[230360]: 2025-12-01 09:30:29.127244328 +0000 UTC m=+0.138284101 container init 2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_bell, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:30:29 np0005540741 podman[230360]: 2025-12-01 09:30:29.1356201 +0000 UTC m=+0.146659813 container start 2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_bell, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec  1 04:30:29 np0005540741 hungry_bell[230387]: 167 167
Dec  1 04:30:29 np0005540741 systemd[1]: libpod-2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707.scope: Deactivated successfully.
Dec  1 04:30:29 np0005540741 podman[230360]: 2025-12-01 09:30:29.146925107 +0000 UTC m=+0.157964870 container attach 2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_bell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec  1 04:30:29 np0005540741 podman[230360]: 2025-12-01 09:30:29.147329209 +0000 UTC m=+0.158368942 container died 2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_bell, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:30:29 np0005540741 systemd[1]: var-lib-containers-storage-overlay-b363f101e2ecbda5ae372ebc4c66af89306a96c15a199b4c818c6951afc16c88-merged.mount: Deactivated successfully.
Dec  1 04:30:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:30:29 np0005540741 podman[230360]: 2025-12-01 09:30:29.37314326 +0000 UTC m=+0.384183013 container remove 2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_bell, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec  1 04:30:29 np0005540741 systemd[1]: libpod-conmon-2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707.scope: Deactivated successfully.
Dec  1 04:30:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v556: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:29 np0005540741 podman[230526]: 2025-12-01 09:30:29.634543491 +0000 UTC m=+0.073267240 container create ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_morse, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:30:29 np0005540741 systemd[1]: Started libpod-conmon-ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086.scope.
Dec  1 04:30:29 np0005540741 podman[230526]: 2025-12-01 09:30:29.607353494 +0000 UTC m=+0.046077273 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:30:29 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:30:29 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b422c0fe7471334c588dc443263eb3ab6f40a0457d5d81a8f3b0f0989adef1c3/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:30:29 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b422c0fe7471334c588dc443263eb3ab6f40a0457d5d81a8f3b0f0989adef1c3/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:30:29 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b422c0fe7471334c588dc443263eb3ab6f40a0457d5d81a8f3b0f0989adef1c3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:30:29 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b422c0fe7471334c588dc443263eb3ab6f40a0457d5d81a8f3b0f0989adef1c3/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:30:29 np0005540741 podman[230526]: 2025-12-01 09:30:29.730740803 +0000 UTC m=+0.169464532 container init ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_morse, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:30:29 np0005540741 podman[230526]: 2025-12-01 09:30:29.739909898 +0000 UTC m=+0.178633617 container start ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_morse, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:30:29 np0005540741 podman[230526]: 2025-12-01 09:30:29.743011838 +0000 UTC m=+0.181735577 container attach ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_morse, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Dec  1 04:30:29 np0005540741 python3.9[230541]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec  1 04:30:30 np0005540741 priceless_morse[230548]: {
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:    "0": [
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:        {
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "devices": [
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "/dev/loop3"
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            ],
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "lv_name": "ceph_lv0",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "lv_size": "21470642176",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "name": "ceph_lv0",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "tags": {
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.cluster_name": "ceph",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.crush_device_class": "",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.encrypted": "0",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.osd_id": "0",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.type": "block",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.vdo": "0"
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            },
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "type": "block",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "vg_name": "ceph_vg0"
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:        }
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:    ],
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:    "1": [
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:        {
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "devices": [
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "/dev/loop4"
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            ],
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "lv_name": "ceph_lv1",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "lv_size": "21470642176",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "name": "ceph_lv1",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "tags": {
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.cluster_name": "ceph",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.crush_device_class": "",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.encrypted": "0",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.osd_id": "1",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.type": "block",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.vdo": "0"
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            },
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "type": "block",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "vg_name": "ceph_vg1"
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:        }
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:    ],
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:    "2": [
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:        {
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "devices": [
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "/dev/loop5"
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            ],
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "lv_name": "ceph_lv2",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "lv_size": "21470642176",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "name": "ceph_lv2",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "tags": {
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.cluster_name": "ceph",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.crush_device_class": "",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.encrypted": "0",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.osd_id": "2",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.type": "block",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:                "ceph.vdo": "0"
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            },
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "type": "block",
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:            "vg_name": "ceph_vg2"
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:        }
Dec  1 04:30:30 np0005540741 priceless_morse[230548]:    ]
Dec  1 04:30:30 np0005540741 priceless_morse[230548]: }
Dec  1 04:30:30 np0005540741 systemd[1]: libpod-ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086.scope: Deactivated successfully.
Dec  1 04:30:30 np0005540741 podman[230526]: 2025-12-01 09:30:30.530050732 +0000 UTC m=+0.968774461 container died ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_morse, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:30:30 np0005540741 systemd[1]: var-lib-containers-storage-overlay-b422c0fe7471334c588dc443263eb3ab6f40a0457d5d81a8f3b0f0989adef1c3-merged.mount: Deactivated successfully.
Dec  1 04:30:30 np0005540741 podman[230526]: 2025-12-01 09:30:30.588249056 +0000 UTC m=+1.026972795 container remove ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_morse, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Dec  1 04:30:30 np0005540741 systemd[1]: libpod-conmon-ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086.scope: Deactivated successfully.
Dec  1 04:30:30 np0005540741 python3.9[230708]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  1 04:30:31 np0005540741 podman[230937]: 2025-12-01 09:30:31.225816186 +0000 UTC m=+0.045984231 container create 74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_franklin, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Dec  1 04:30:31 np0005540741 systemd[1]: Started libpod-conmon-74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f.scope.
Dec  1 04:30:31 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:30:31 np0005540741 podman[230937]: 2025-12-01 09:30:31.202175922 +0000 UTC m=+0.022343997 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:30:31 np0005540741 podman[230937]: 2025-12-01 09:30:31.307815677 +0000 UTC m=+0.127983712 container init 74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_franklin, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True)
Dec  1 04:30:31 np0005540741 podman[230937]: 2025-12-01 09:30:31.314926763 +0000 UTC m=+0.135094768 container start 74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_franklin, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True)
Dec  1 04:30:31 np0005540741 podman[230937]: 2025-12-01 09:30:31.31791989 +0000 UTC m=+0.138087925 container attach 74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_franklin, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:30:31 np0005540741 great_franklin[230974]: 167 167
Dec  1 04:30:31 np0005540741 systemd[1]: libpod-74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f.scope: Deactivated successfully.
Dec  1 04:30:31 np0005540741 podman[230937]: 2025-12-01 09:30:31.32310851 +0000 UTC m=+0.143276525 container died 74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_franklin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec  1 04:30:31 np0005540741 systemd[1]: var-lib-containers-storage-overlay-5becdfe9ef8c9309a117519f59f92ba06f6a99d6f64f24bac130d3fc2cf7d6d6-merged.mount: Deactivated successfully.
Dec  1 04:30:31 np0005540741 podman[230937]: 2025-12-01 09:30:31.37047857 +0000 UTC m=+0.190646585 container remove 74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_franklin, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  1 04:30:31 np0005540741 systemd[1]: libpod-conmon-74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f.scope: Deactivated successfully.
Dec  1 04:30:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v557: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:31 np0005540741 podman[231053]: 2025-12-01 09:30:31.568370354 +0000 UTC m=+0.046301971 container create f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shirley, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec  1 04:30:31 np0005540741 systemd[1]: Started libpod-conmon-f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559.scope.
Dec  1 04:30:31 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:30:31 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eaeca87c70200b39d0b478696a395d2839d0740104be83dd09f8e55f0538b77d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:30:31 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eaeca87c70200b39d0b478696a395d2839d0740104be83dd09f8e55f0538b77d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:30:31 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eaeca87c70200b39d0b478696a395d2839d0740104be83dd09f8e55f0538b77d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:30:31 np0005540741 podman[231053]: 2025-12-01 09:30:31.54958751 +0000 UTC m=+0.027519147 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:30:31 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eaeca87c70200b39d0b478696a395d2839d0740104be83dd09f8e55f0538b77d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:30:31 np0005540741 podman[231053]: 2025-12-01 09:30:31.657982516 +0000 UTC m=+0.135914163 container init f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shirley, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:30:31 np0005540741 podman[231053]: 2025-12-01 09:30:31.666268395 +0000 UTC m=+0.144200012 container start f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shirley, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  1 04:30:31 np0005540741 podman[231053]: 2025-12-01 09:30:31.669819348 +0000 UTC m=+0.147750985 container attach f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shirley, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:30:31 np0005540741 python3.9[231048]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]: {
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:        "osd_id": 0,
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:        "type": "bluestore"
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:    },
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:        "osd_id": 1,
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:        "type": "bluestore"
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:    },
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:        "osd_id": 2,
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:        "type": "bluestore"
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]:    }
Dec  1 04:30:32 np0005540741 sleepy_shirley[231070]: }
Dec  1 04:30:32 np0005540741 systemd[1]: libpod-f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559.scope: Deactivated successfully.
Dec  1 04:30:32 np0005540741 systemd[1]: libpod-f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559.scope: Consumed 1.090s CPU time.
Dec  1 04:30:32 np0005540741 podman[231053]: 2025-12-01 09:30:32.754496571 +0000 UTC m=+1.232428218 container died f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shirley, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec  1 04:30:32 np0005540741 systemd[1]: var-lib-containers-storage-overlay-eaeca87c70200b39d0b478696a395d2839d0740104be83dd09f8e55f0538b77d-merged.mount: Deactivated successfully.
Dec  1 04:30:32 np0005540741 podman[231053]: 2025-12-01 09:30:32.819885342 +0000 UTC m=+1.297816959 container remove f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shirley, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:30:32 np0005540741 systemd[1]: libpod-conmon-f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559.scope: Deactivated successfully.
Dec  1 04:30:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:30:32 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:30:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:30:32 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:30:32 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 19ebf54b-18f2-4d6a-9403-9db0675f9a39 does not exist
Dec  1 04:30:33 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:30:33 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:30:33 np0005540741 python3[231339]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  1 04:30:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v558: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:30:34 np0005540741 podman[231354]: 2025-12-01 09:30:34.690601679 +0000 UTC m=+1.318458484 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  1 04:30:34 np0005540741 podman[231414]: 2025-12-01 09:30:34.82753053 +0000 UTC m=+0.030451972 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  1 04:30:35 np0005540741 podman[231414]: 2025-12-01 09:30:35.056086411 +0000 UTC m=+0.259007793 container create 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Dec  1 04:30:35 np0005540741 python3[231339]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  1 04:30:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v559: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:35 np0005540741 python3.9[231603]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:30:36 np0005540741 python3.9[231757]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:37 np0005540741 python3.9[231833]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:30:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v560: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:37 np0005540741 python3.9[231984]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764581437.1419468-550-125027055710573/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:38 np0005540741 python3.9[232060]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 04:30:38 np0005540741 systemd[1]: Reloading.
Dec  1 04:30:38 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:30:38 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:30:39 np0005540741 python3.9[232171]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:30:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:30:39 np0005540741 systemd[1]: Reloading.
Dec  1 04:30:39 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:30:39 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:30:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v561: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:39 np0005540741 systemd[1]: Starting multipathd container...
Dec  1 04:30:39 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:30:39 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50a1c402b0115b7d9031e9416a904b5d23245fa1d19d3ce51f831e7ed4fa7b81/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  1 04:30:39 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50a1c402b0115b7d9031e9416a904b5d23245fa1d19d3ce51f831e7ed4fa7b81/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  1 04:30:39 np0005540741 systemd[1]: Started /usr/bin/podman healthcheck run 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d.
Dec  1 04:30:39 np0005540741 podman[232210]: 2025-12-01 09:30:39.873874049 +0000 UTC m=+0.130296100 container init 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Dec  1 04:30:39 np0005540741 multipathd[232225]: + sudo -E kolla_set_configs
Dec  1 04:30:39 np0005540741 podman[232210]: 2025-12-01 09:30:39.911408854 +0000 UTC m=+0.167830885 container start 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0)
Dec  1 04:30:39 np0005540741 podman[232210]: multipathd
Dec  1 04:30:39 np0005540741 systemd[1]: Started multipathd container.
Dec  1 04:30:39 np0005540741 multipathd[232225]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 04:30:39 np0005540741 multipathd[232225]: INFO:__main__:Validating config file
Dec  1 04:30:39 np0005540741 multipathd[232225]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 04:30:39 np0005540741 multipathd[232225]: INFO:__main__:Writing out command to execute
Dec  1 04:30:39 np0005540741 multipathd[232225]: ++ cat /run_command
Dec  1 04:30:39 np0005540741 multipathd[232225]: + CMD='/usr/sbin/multipathd -d'
Dec  1 04:30:39 np0005540741 multipathd[232225]: + ARGS=
Dec  1 04:30:39 np0005540741 multipathd[232225]: + sudo kolla_copy_cacerts
Dec  1 04:30:39 np0005540741 podman[232231]: 2025-12-01 09:30:39.992912642 +0000 UTC m=+0.072168259 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 04:30:39 np0005540741 multipathd[232225]: Running command: '/usr/sbin/multipathd -d'
Dec  1 04:30:39 np0005540741 multipathd[232225]: + [[ ! -n '' ]]
Dec  1 04:30:39 np0005540741 multipathd[232225]: + . kolla_extend_start
Dec  1 04:30:39 np0005540741 multipathd[232225]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec  1 04:30:39 np0005540741 multipathd[232225]: + umask 0022
Dec  1 04:30:39 np0005540741 multipathd[232225]: + exec /usr/sbin/multipathd -d
Dec  1 04:30:40 np0005540741 systemd[1]: 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d-60c59a972f790ebc.service: Main process exited, code=exited, status=1/FAILURE
Dec  1 04:30:40 np0005540741 systemd[1]: 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d-60c59a972f790ebc.service: Failed with result 'exit-code'.
Dec  1 04:30:40 np0005540741 multipathd[232225]: 3284.985264 | --------start up--------
Dec  1 04:30:40 np0005540741 multipathd[232225]: 3284.985287 | read /etc/multipath.conf
Dec  1 04:30:40 np0005540741 multipathd[232225]: 3284.992063 | path checkers start up
Dec  1 04:30:40 np0005540741 python3.9[232414]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:30:41 np0005540741 python3.9[232568]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:30:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v562: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:42 np0005540741 python3.9[232733]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:30:42 np0005540741 systemd[1]: Stopping multipathd container...
Dec  1 04:30:42 np0005540741 multipathd[232225]: 3287.407905 | exit (signal)
Dec  1 04:30:42 np0005540741 multipathd[232225]: 3287.407990 | --------shut down-------
Dec  1 04:30:42 np0005540741 systemd[1]: libpod-832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d.scope: Deactivated successfully.
Dec  1 04:30:42 np0005540741 podman[232737]: 2025-12-01 09:30:42.469557165 +0000 UTC m=+0.074051352 container died 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  1 04:30:42 np0005540741 systemd[1]: 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d-60c59a972f790ebc.timer: Deactivated successfully.
Dec  1 04:30:42 np0005540741 systemd[1]: Stopped /usr/bin/podman healthcheck run 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d.
Dec  1 04:30:42 np0005540741 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d-userdata-shm.mount: Deactivated successfully.
Dec  1 04:30:42 np0005540741 systemd[1]: var-lib-containers-storage-overlay-50a1c402b0115b7d9031e9416a904b5d23245fa1d19d3ce51f831e7ed4fa7b81-merged.mount: Deactivated successfully.
Dec  1 04:30:42 np0005540741 podman[232737]: 2025-12-01 09:30:42.650628616 +0000 UTC m=+0.255122823 container cleanup 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd)
Dec  1 04:30:42 np0005540741 podman[232737]: multipathd
Dec  1 04:30:42 np0005540741 podman[232764]: multipathd
Dec  1 04:30:42 np0005540741 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec  1 04:30:42 np0005540741 systemd[1]: Stopped multipathd container.
Dec  1 04:30:42 np0005540741 systemd[1]: Starting multipathd container...
Dec  1 04:30:42 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:30:42 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50a1c402b0115b7d9031e9416a904b5d23245fa1d19d3ce51f831e7ed4fa7b81/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  1 04:30:42 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50a1c402b0115b7d9031e9416a904b5d23245fa1d19d3ce51f831e7ed4fa7b81/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  1 04:30:42 np0005540741 systemd[1]: Started /usr/bin/podman healthcheck run 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d.
Dec  1 04:30:42 np0005540741 podman[232777]: 2025-12-01 09:30:42.897658654 +0000 UTC m=+0.142157372 container init 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 04:30:42 np0005540741 multipathd[232792]: + sudo -E kolla_set_configs
Dec  1 04:30:42 np0005540741 podman[232777]: 2025-12-01 09:30:42.937897422 +0000 UTC m=+0.182396070 container start 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  1 04:30:42 np0005540741 podman[232777]: multipathd
Dec  1 04:30:42 np0005540741 systemd[1]: Started multipathd container.
Dec  1 04:30:42 np0005540741 multipathd[232792]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 04:30:42 np0005540741 multipathd[232792]: INFO:__main__:Validating config file
Dec  1 04:30:42 np0005540741 multipathd[232792]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 04:30:42 np0005540741 multipathd[232792]: INFO:__main__:Writing out command to execute
Dec  1 04:30:43 np0005540741 multipathd[232792]: ++ cat /run_command
Dec  1 04:30:43 np0005540741 multipathd[232792]: + CMD='/usr/sbin/multipathd -d'
Dec  1 04:30:43 np0005540741 multipathd[232792]: + ARGS=
Dec  1 04:30:43 np0005540741 multipathd[232792]: + sudo kolla_copy_cacerts
Dec  1 04:30:43 np0005540741 multipathd[232792]: + [[ ! -n '' ]]
Dec  1 04:30:43 np0005540741 multipathd[232792]: + . kolla_extend_start
Dec  1 04:30:43 np0005540741 multipathd[232792]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec  1 04:30:43 np0005540741 multipathd[232792]: Running command: '/usr/sbin/multipathd -d'
Dec  1 04:30:43 np0005540741 multipathd[232792]: + umask 0022
Dec  1 04:30:43 np0005540741 multipathd[232792]: + exec /usr/sbin/multipathd -d
Dec  1 04:30:43 np0005540741 podman[232799]: 2025-12-01 09:30:43.044195821 +0000 UTC m=+0.089674462 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  1 04:30:43 np0005540741 multipathd[232792]: 3288.014249 | --------start up--------
Dec  1 04:30:43 np0005540741 multipathd[232792]: 3288.014413 | read /etc/multipath.conf
Dec  1 04:30:43 np0005540741 multipathd[232792]: 3288.020230 | path checkers start up
Dec  1 04:30:43 np0005540741 systemd[1]: 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d-64c3927ea1fd61c3.service: Main process exited, code=exited, status=1/FAILURE
Dec  1 04:30:43 np0005540741 systemd[1]: 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d-64c3927ea1fd61c3.service: Failed with result 'exit-code'.
Dec  1 04:30:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:30:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:30:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:30:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:30:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:30:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:30:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v563: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:43 np0005540741 python3.9[232984]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:30:44 np0005540741 python3.9[233136]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  1 04:30:45 np0005540741 python3.9[233288]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec  1 04:30:45 np0005540741 kernel: Key type psk registered
Dec  1 04:30:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v564: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:46 np0005540741 python3.9[233451]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:30:46 np0005540741 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec  1 04:30:46 np0005540741 python3.9[233575]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581445.567507-630-248121160340352/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:47 np0005540741 python3.9[233727]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v565: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:47 np0005540741 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  1 04:30:48 np0005540741 python3.9[233880]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:30:48 np0005540741 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec  1 04:30:48 np0005540741 systemd[1]: Stopped Load Kernel Modules.
Dec  1 04:30:48 np0005540741 systemd[1]: Stopping Load Kernel Modules...
Dec  1 04:30:48 np0005540741 systemd[1]: Starting Load Kernel Modules...
Dec  1 04:30:48 np0005540741 systemd[1]: Finished Load Kernel Modules.
Dec  1 04:30:49 np0005540741 python3.9[234036]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:30:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:30:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v566: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v567: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:51 np0005540741 systemd[1]: Reloading.
Dec  1 04:30:51 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:30:51 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:30:51 np0005540741 podman[234043]: 2025-12-01 09:30:51.82774614 +0000 UTC m=+0.105981581 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec  1 04:30:52 np0005540741 systemd[1]: Reloading.
Dec  1 04:30:52 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:30:52 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:30:52 np0005540741 systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Dec  1 04:30:52 np0005540741 systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec  1 04:30:52 np0005540741 lvm[234179]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec  1 04:30:52 np0005540741 lvm[234179]: VG ceph_vg1 finished
Dec  1 04:30:52 np0005540741 lvm[234178]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 04:30:52 np0005540741 lvm[234178]: VG ceph_vg0 finished
Dec  1 04:30:52 np0005540741 lvm[234180]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec  1 04:30:52 np0005540741 lvm[234180]: VG ceph_vg2 finished
Dec  1 04:30:52 np0005540741 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:30:52 np0005540741 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:30:52 np0005540741 systemd[1]: Reloading.
Dec  1 04:30:52 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:30:52 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:30:53 np0005540741 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:30:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v568: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:54 np0005540741 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:30:54 np0005540741 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:30:54 np0005540741 systemd[1]: man-db-cache-update.service: Consumed 1.744s CPU time.
Dec  1 04:30:54 np0005540741 systemd[1]: run-r2f03deedab9b48618c1677bc4cb20a2a.service: Deactivated successfully.
Dec  1 04:30:54 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:30:54 np0005540741 python3.9[235500]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:30:54 np0005540741 iscsid[222973]: iscsid shutting down.
Dec  1 04:30:54 np0005540741 systemd[1]: Stopping Open-iSCSI...
Dec  1 04:30:54 np0005540741 systemd[1]: iscsid.service: Deactivated successfully.
Dec  1 04:30:54 np0005540741 systemd[1]: Stopped Open-iSCSI.
Dec  1 04:30:54 np0005540741 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec  1 04:30:54 np0005540741 systemd[1]: Starting Open-iSCSI...
Dec  1 04:30:54 np0005540741 systemd[1]: Started Open-iSCSI.
Dec  1 04:30:55 np0005540741 python3.9[235678]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:30:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v569: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:55 np0005540741 podman[235728]: 2025-12-01 09:30:55.964617118 +0000 UTC m=+0.067122852 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  1 04:30:56 np0005540741 python3.9[235854]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:30:57 np0005540741 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec  1 04:30:57 np0005540741 systemd[1]: virtqemud.service: Deactivated successfully.
Dec  1 04:30:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v570: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:30:57 np0005540741 python3.9[236008]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 04:30:57 np0005540741 systemd[1]: Reloading.
Dec  1 04:30:57 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:30:57 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:30:58 np0005540741 python3.9[236193]: ansible-ansible.builtin.service_facts Invoked
Dec  1 04:30:58 np0005540741 network[236210]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:30:58 np0005540741 network[236211]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:30:58 np0005540741 network[236212]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:30:59 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:30:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v571: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v572: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:02 np0005540741 python3.9[236489]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:31:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v573: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:03 np0005540741 python3.9[236642]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:31:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:31:04 np0005540741 python3.9[236795]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:31:05 np0005540741 python3.9[236948]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:31:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v574: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:06 np0005540741 python3.9[237101]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:31:06 np0005540741 python3.9[237254]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:31:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v575: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:07 np0005540741 python3.9[237407]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:31:08 np0005540741 python3.9[237560]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:31:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:31:09 np0005540741 python3.9[237713]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v576: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:10 np0005540741 python3.9[237865]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:10 np0005540741 python3.9[238017]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:11 np0005540741 python3.9[238169]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v577: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:12 np0005540741 python3.9[238321]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:12 np0005540741 python3.9[238473]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:31:13
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['volumes', 'backups', 'images', 'vms', '.mgr', 'cephfs.cephfs.data', 'cephfs.cephfs.meta']
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:31:13 np0005540741 podman[238597]: 2025-12-01 09:31:13.285082909 +0000 UTC m=+0.068328747 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec  1 04:31:13 np0005540741 python3.9[238643]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v578: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:14 np0005540741 python3.9[238798]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:31:14 np0005540741 python3.9[238950]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v579: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:15 np0005540741 python3.9[239102]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:16 np0005540741 python3.9[239254]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:16 np0005540741 python3.9[239406]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v580: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:17 np0005540741 python3.9[239558]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:18 np0005540741 python3.9[239710]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:31:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:31:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:31:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:31:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:31:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:31:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:31:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:31:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:31:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:31:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:31:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:31:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:31:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:31:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:31:19 np0005540741 python3.9[239862]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:31:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v581: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:19 np0005540741 python3.9[240014]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:31:20.464 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:31:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:31:20.465 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:31:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:31:20.465 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:31:20 np0005540741 python3.9[240166]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:31:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v582: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:21 np0005540741 python3.9[240318]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 04:31:22 np0005540741 podman[240442]: 2025-12-01 09:31:22.400367523 +0000 UTC m=+0.097825596 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  1 04:31:22 np0005540741 python3.9[240490]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 04:31:22 np0005540741 systemd[1]: Reloading.
Dec  1 04:31:22 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:31:22 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:31:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v583: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:23 np0005540741 python3.9[240684]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:31:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:31:24 np0005540741 python3.9[240837]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:31:24 np0005540741 python3.9[240990]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:31:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v584: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:25 np0005540741 python3.9[241143]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:31:26 np0005540741 podman[241296]: 2025-12-01 09:31:26.112489761 +0000 UTC m=+0.057417433 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  1 04:31:26 np0005540741 python3.9[241297]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:31:26 np0005540741 python3.9[241467]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:31:27 np0005540741 python3.9[241620]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:31:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v585: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:28 np0005540741 python3.9[241773]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:31:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:31:29 np0005540741 python3.9[241926]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v586: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:30 np0005540741 python3.9[242078]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:30 np0005540741 python3.9[242230]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:31 np0005540741 python3.9[242382]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v587: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:31 np0005540741 python3.9[242534]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:32 np0005540741 python3.9[242686]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:33 np0005540741 python3.9[242838]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v588: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:31:33 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:31:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:31:33 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:31:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:31:33 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:31:33 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 328781bf-619a-4e2d-8362-c884c9d853b4 does not exist
Dec  1 04:31:33 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 7a0e186b-c5ce-44cc-834b-1fecd670fe7f does not exist
Dec  1 04:31:33 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev b3cf4ed0-5a30-40a7-bf27-a6f2cbd2875c does not exist
Dec  1 04:31:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:31:33 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:31:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:31:33 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:31:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:31:33 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:31:33 np0005540741 python3.9[243109]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:34 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:31:34 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:31:34 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:31:34 np0005540741 podman[243415]: 2025-12-01 09:31:34.298270658 +0000 UTC m=+0.038488178 container create 7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_kalam, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 04:31:34 np0005540741 systemd[1]: Started libpod-conmon-7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b.scope.
Dec  1 04:31:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:31:34 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:31:34 np0005540741 podman[243415]: 2025-12-01 09:31:34.28165245 +0000 UTC m=+0.021870000 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:31:34 np0005540741 podman[243415]: 2025-12-01 09:31:34.385164029 +0000 UTC m=+0.125381599 container init 7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_kalam, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:31:34 np0005540741 podman[243415]: 2025-12-01 09:31:34.394039174 +0000 UTC m=+0.134256714 container start 7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_kalam, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:31:34 np0005540741 podman[243415]: 2025-12-01 09:31:34.399301675 +0000 UTC m=+0.139519235 container attach 7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_kalam, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:31:34 np0005540741 keen_kalam[243431]: 167 167
Dec  1 04:31:34 np0005540741 systemd[1]: libpod-7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b.scope: Deactivated successfully.
Dec  1 04:31:34 np0005540741 conmon[243431]: conmon 7cbd28e6f58db1cec8e5 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b.scope/container/memory.events
Dec  1 04:31:34 np0005540741 podman[243415]: 2025-12-01 09:31:34.401481728 +0000 UTC m=+0.141699268 container died 7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_kalam, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:31:34 np0005540741 systemd[1]: var-lib-containers-storage-overlay-7997302c7d1ad48698deb4da87f2f6ac9fb2b98f52059da05c581182aaa5dc3d-merged.mount: Deactivated successfully.
Dec  1 04:31:34 np0005540741 python3.9[243407]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:34 np0005540741 podman[243415]: 2025-12-01 09:31:34.446649178 +0000 UTC m=+0.186866698 container remove 7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_kalam, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:31:34 np0005540741 systemd[1]: libpod-conmon-7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b.scope: Deactivated successfully.
Dec  1 04:31:34 np0005540741 podman[243486]: 2025-12-01 09:31:34.622633212 +0000 UTC m=+0.043795201 container create 62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_pasteur, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec  1 04:31:34 np0005540741 systemd[1]: Started libpod-conmon-62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867.scope.
Dec  1 04:31:34 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:31:34 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdd467dc8298df01010ecb1bfa1fc253fb485e024864cd815d6107c68b255822/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:31:34 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdd467dc8298df01010ecb1bfa1fc253fb485e024864cd815d6107c68b255822/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:31:34 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdd467dc8298df01010ecb1bfa1fc253fb485e024864cd815d6107c68b255822/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:31:34 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdd467dc8298df01010ecb1bfa1fc253fb485e024864cd815d6107c68b255822/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:31:34 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdd467dc8298df01010ecb1bfa1fc253fb485e024864cd815d6107c68b255822/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:31:34 np0005540741 podman[243486]: 2025-12-01 09:31:34.6037936 +0000 UTC m=+0.024955619 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:31:34 np0005540741 podman[243486]: 2025-12-01 09:31:34.709078589 +0000 UTC m=+0.130240668 container init 62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_pasteur, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec  1 04:31:34 np0005540741 podman[243486]: 2025-12-01 09:31:34.717589304 +0000 UTC m=+0.138751303 container start 62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_pasteur, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec  1 04:31:34 np0005540741 podman[243486]: 2025-12-01 09:31:34.721707673 +0000 UTC m=+0.142869692 container attach 62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_pasteur, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:31:35 np0005540741 python3.9[243627]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v589: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:35 np0005540741 elegant_pasteur[243547]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:31:35 np0005540741 elegant_pasteur[243547]: --> relative data size: 1.0
Dec  1 04:31:35 np0005540741 elegant_pasteur[243547]: --> All data devices are unavailable
Dec  1 04:31:35 np0005540741 systemd[1]: libpod-62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867.scope: Deactivated successfully.
Dec  1 04:31:35 np0005540741 podman[243486]: 2025-12-01 09:31:35.753159052 +0000 UTC m=+1.174321051 container died 62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_pasteur, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec  1 04:31:35 np0005540741 systemd[1]: libpod-62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867.scope: Consumed 1.000s CPU time.
Dec  1 04:31:35 np0005540741 systemd[1]: var-lib-containers-storage-overlay-bdd467dc8298df01010ecb1bfa1fc253fb485e024864cd815d6107c68b255822-merged.mount: Deactivated successfully.
Dec  1 04:31:35 np0005540741 podman[243486]: 2025-12-01 09:31:35.804582082 +0000 UTC m=+1.225744091 container remove 62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_pasteur, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:31:35 np0005540741 systemd[1]: libpod-conmon-62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867.scope: Deactivated successfully.
Dec  1 04:31:36 np0005540741 podman[243829]: 2025-12-01 09:31:36.450054296 +0000 UTC m=+0.041522806 container create 5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_wiles, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec  1 04:31:36 np0005540741 systemd[1]: Started libpod-conmon-5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29.scope.
Dec  1 04:31:36 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:31:36 np0005540741 podman[243829]: 2025-12-01 09:31:36.521482371 +0000 UTC m=+0.112950901 container init 5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_wiles, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef)
Dec  1 04:31:36 np0005540741 podman[243829]: 2025-12-01 09:31:36.529350898 +0000 UTC m=+0.120819408 container start 5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_wiles, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True)
Dec  1 04:31:36 np0005540741 podman[243829]: 2025-12-01 09:31:36.433441188 +0000 UTC m=+0.024909718 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:31:36 np0005540741 podman[243829]: 2025-12-01 09:31:36.532982712 +0000 UTC m=+0.124451252 container attach 5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_wiles, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Dec  1 04:31:36 np0005540741 gifted_wiles[243845]: 167 167
Dec  1 04:31:36 np0005540741 systemd[1]: libpod-5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29.scope: Deactivated successfully.
Dec  1 04:31:36 np0005540741 podman[243829]: 2025-12-01 09:31:36.535041001 +0000 UTC m=+0.126509511 container died 5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_wiles, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:31:36 np0005540741 systemd[1]: var-lib-containers-storage-overlay-75f41480405acf7135bdc71fdb024b4069c4c85e95081da9febcd6c27170516f-merged.mount: Deactivated successfully.
Dec  1 04:31:36 np0005540741 podman[243829]: 2025-12-01 09:31:36.571720727 +0000 UTC m=+0.163189237 container remove 5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_wiles, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec  1 04:31:36 np0005540741 systemd[1]: libpod-conmon-5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29.scope: Deactivated successfully.
Dec  1 04:31:36 np0005540741 podman[243869]: 2025-12-01 09:31:36.744804587 +0000 UTC m=+0.043399029 container create a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_ganguly, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec  1 04:31:36 np0005540741 systemd[1]: Started libpod-conmon-a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e.scope.
Dec  1 04:31:36 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:31:36 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42c25d8d5f2c358e4a5459c59cbe467416dafe9ac9857ba4f4554c46a131245d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:31:36 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42c25d8d5f2c358e4a5459c59cbe467416dafe9ac9857ba4f4554c46a131245d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:31:36 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42c25d8d5f2c358e4a5459c59cbe467416dafe9ac9857ba4f4554c46a131245d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:31:36 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42c25d8d5f2c358e4a5459c59cbe467416dafe9ac9857ba4f4554c46a131245d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:31:36 np0005540741 podman[243869]: 2025-12-01 09:31:36.727063407 +0000 UTC m=+0.025657879 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:31:36 np0005540741 podman[243869]: 2025-12-01 09:31:36.829155815 +0000 UTC m=+0.127750257 container init a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_ganguly, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:31:36 np0005540741 podman[243869]: 2025-12-01 09:31:36.835961171 +0000 UTC m=+0.134555613 container start a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_ganguly, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef)
Dec  1 04:31:36 np0005540741 podman[243869]: 2025-12-01 09:31:36.838755961 +0000 UTC m=+0.137350403 container attach a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_ganguly, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec  1 04:31:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v590: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]: {
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:    "0": [
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:        {
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "devices": [
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "/dev/loop3"
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            ],
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "lv_name": "ceph_lv0",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "lv_size": "21470642176",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "name": "ceph_lv0",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "tags": {
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.cluster_name": "ceph",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.crush_device_class": "",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.encrypted": "0",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.osd_id": "0",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.type": "block",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.vdo": "0"
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            },
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "type": "block",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "vg_name": "ceph_vg0"
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:        }
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:    ],
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:    "1": [
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:        {
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "devices": [
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "/dev/loop4"
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            ],
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "lv_name": "ceph_lv1",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "lv_size": "21470642176",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "name": "ceph_lv1",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "tags": {
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.cluster_name": "ceph",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.crush_device_class": "",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.encrypted": "0",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.osd_id": "1",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.type": "block",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.vdo": "0"
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            },
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "type": "block",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "vg_name": "ceph_vg1"
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:        }
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:    ],
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:    "2": [
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:        {
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "devices": [
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "/dev/loop5"
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            ],
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "lv_name": "ceph_lv2",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "lv_size": "21470642176",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "name": "ceph_lv2",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "tags": {
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.cluster_name": "ceph",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.crush_device_class": "",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.encrypted": "0",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.osd_id": "2",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.type": "block",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:                "ceph.vdo": "0"
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            },
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "type": "block",
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:            "vg_name": "ceph_vg2"
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:        }
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]:    ]
Dec  1 04:31:37 np0005540741 kind_ganguly[243886]: }
Dec  1 04:31:37 np0005540741 systemd[1]: libpod-a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e.scope: Deactivated successfully.
Dec  1 04:31:37 np0005540741 podman[243869]: 2025-12-01 09:31:37.617903361 +0000 UTC m=+0.916497813 container died a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_ganguly, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:31:37 np0005540741 systemd[1]: var-lib-containers-storage-overlay-42c25d8d5f2c358e4a5459c59cbe467416dafe9ac9857ba4f4554c46a131245d-merged.mount: Deactivated successfully.
Dec  1 04:31:37 np0005540741 podman[243869]: 2025-12-01 09:31:37.673026277 +0000 UTC m=+0.971620709 container remove a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_ganguly, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec  1 04:31:37 np0005540741 systemd[1]: libpod-conmon-a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e.scope: Deactivated successfully.
Dec  1 04:31:38 np0005540741 podman[244047]: 2025-12-01 09:31:38.346462946 +0000 UTC m=+0.060837842 container create 8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:31:38 np0005540741 systemd[1]: Started libpod-conmon-8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01.scope.
Dec  1 04:31:38 np0005540741 podman[244047]: 2025-12-01 09:31:38.324743191 +0000 UTC m=+0.039118077 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:31:38 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:31:38 np0005540741 podman[244047]: 2025-12-01 09:31:38.442660934 +0000 UTC m=+0.157035810 container init 8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default)
Dec  1 04:31:38 np0005540741 podman[244047]: 2025-12-01 09:31:38.452432035 +0000 UTC m=+0.166806911 container start 8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec  1 04:31:38 np0005540741 podman[244047]: 2025-12-01 09:31:38.456354778 +0000 UTC m=+0.170729634 container attach 8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec  1 04:31:38 np0005540741 sweet_jennings[244063]: 167 167
Dec  1 04:31:38 np0005540741 systemd[1]: libpod-8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01.scope: Deactivated successfully.
Dec  1 04:31:38 np0005540741 podman[244047]: 2025-12-01 09:31:38.462166715 +0000 UTC m=+0.176541661 container died 8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Dec  1 04:31:38 np0005540741 systemd[1]: var-lib-containers-storage-overlay-2956d7cffda573b5985a5904a098e3b398e7c8c256fe2432238256910069cdd1-merged.mount: Deactivated successfully.
Dec  1 04:31:38 np0005540741 podman[244047]: 2025-12-01 09:31:38.526367353 +0000 UTC m=+0.240742249 container remove 8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec  1 04:31:38 np0005540741 systemd[1]: libpod-conmon-8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01.scope: Deactivated successfully.
Dec  1 04:31:38 np0005540741 podman[244085]: 2025-12-01 09:31:38.722684672 +0000 UTC m=+0.056795205 container create 0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_euler, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec  1 04:31:38 np0005540741 systemd[1]: Started libpod-conmon-0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed.scope.
Dec  1 04:31:38 np0005540741 podman[244085]: 2025-12-01 09:31:38.691460743 +0000 UTC m=+0.025571336 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:31:38 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:31:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a9ab0315d09837ebcd9ce95ff055860f90f4629967c7a6832f9b14e55d9a95c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:31:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a9ab0315d09837ebcd9ce95ff055860f90f4629967c7a6832f9b14e55d9a95c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:31:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a9ab0315d09837ebcd9ce95ff055860f90f4629967c7a6832f9b14e55d9a95c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:31:38 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a9ab0315d09837ebcd9ce95ff055860f90f4629967c7a6832f9b14e55d9a95c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:31:38 np0005540741 podman[244085]: 2025-12-01 09:31:38.822117563 +0000 UTC m=+0.156228056 container init 0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_euler, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec  1 04:31:38 np0005540741 podman[244085]: 2025-12-01 09:31:38.835321663 +0000 UTC m=+0.169432156 container start 0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_euler, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:31:38 np0005540741 podman[244085]: 2025-12-01 09:31:38.838161035 +0000 UTC m=+0.172271548 container attach 0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_euler, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:31:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:31:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v591: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]: {
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:        "osd_id": 0,
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:        "type": "bluestore"
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:    },
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:        "osd_id": 1,
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:        "type": "bluestore"
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:    },
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:        "osd_id": 2,
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:        "type": "bluestore"
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]:    }
Dec  1 04:31:39 np0005540741 intelligent_euler[244102]: }
Dec  1 04:31:39 np0005540741 systemd[1]: libpod-0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed.scope: Deactivated successfully.
Dec  1 04:31:39 np0005540741 podman[244135]: 2025-12-01 09:31:39.849552847 +0000 UTC m=+0.029486089 container died 0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_euler, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True)
Dec  1 04:31:39 np0005540741 systemd[1]: var-lib-containers-storage-overlay-8a9ab0315d09837ebcd9ce95ff055860f90f4629967c7a6832f9b14e55d9a95c-merged.mount: Deactivated successfully.
Dec  1 04:31:39 np0005540741 podman[244135]: 2025-12-01 09:31:39.904703494 +0000 UTC m=+0.084636646 container remove 0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_euler, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec  1 04:31:39 np0005540741 systemd[1]: libpod-conmon-0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed.scope: Deactivated successfully.
Dec  1 04:31:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:31:39 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:31:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:31:39 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:31:39 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 222a10f9-4c30-4c00-bc67-6e5f56907cbc does not exist
Dec  1 04:31:40 np0005540741 python3.9[244327]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec  1 04:31:40 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:31:40 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:31:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v592: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:41 np0005540741 python3.9[244480]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 04:31:42 np0005540741 python3.9[244638]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  1 04:31:42 np0005540741 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:31:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:31:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:31:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:31:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:31:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:31:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:31:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v593: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:43 np0005540741 systemd-logind[788]: New session 51 of user zuul.
Dec  1 04:31:43 np0005540741 systemd[1]: Started Session 51 of User zuul.
Dec  1 04:31:43 np0005540741 podman[244674]: 2025-12-01 09:31:43.829752408 +0000 UTC m=+0.095260532 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:31:43 np0005540741 systemd[1]: session-51.scope: Deactivated successfully.
Dec  1 04:31:43 np0005540741 systemd-logind[788]: Session 51 logged out. Waiting for processes to exit.
Dec  1 04:31:43 np0005540741 systemd-logind[788]: Removed session 51.
Dec  1 04:31:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:31:44 np0005540741 python3.9[244846]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:31:45 np0005540741 python3.9[244967]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581504.0485396-1249-175207014057183/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v594: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:45 np0005540741 python3.9[245117]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:31:46 np0005540741 python3.9[245193]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:46 np0005540741 python3.9[245343]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:31:47 np0005540741 python3.9[245464]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581506.1904726-1249-127275162569772/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v595: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:47 np0005540741 python3.9[245614]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:31:48 np0005540741 python3.9[245735]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581507.3064673-1249-30708396262785/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:48 np0005540741 python3.9[245885]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:31:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:31:49 np0005540741 python3.9[246006]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581508.5360897-1249-255925187640533/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v596: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:50 np0005540741 python3.9[246156]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:31:50 np0005540741 python3.9[246277]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581509.6196575-1249-240847757232484/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:51 np0005540741 python3.9[246429]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v597: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:51 np0005540741 python3.9[246581]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:31:52 np0005540741 python3.9[246733]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:31:52 np0005540741 podman[246857]: 2025-12-01 09:31:52.981829952 +0000 UTC m=+0.091943117 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  1 04:31:53 np0005540741 python3.9[246900]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:31:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v598: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:53 np0005540741 python3.9[247034]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764581512.6393037-1356-255026109672473/.source _original_basename=.nsdtykkl follow=False checksum=99e1f0f07cd296b27b32892c19cd6590291d74e3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec  1 04:31:54 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:31:54 np0005540741 python3.9[247186]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:31:55 np0005540741 python3.9[247338]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:31:55 np0005540741 python3.9[247459]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581514.602423-1382-248618224208453/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v599: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:56 np0005540741 python3.9[247609]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:31:56 np0005540741 podman[247704]: 2025-12-01 09:31:56.553248342 +0000 UTC m=+0.051618876 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  1 04:31:56 np0005540741 python3.9[247743]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581515.7262225-1397-255598607723688/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:31:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v600: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:58 np0005540741 python3.9[247901]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec  1 04:31:58 np0005540741 python3.9[248053]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  1 04:31:59 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:31:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v601: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:31:59 np0005540741 python3[248205]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec  1 04:32:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v602: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v603: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:32:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v604: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v605: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:09 np0005540741 podman[248219]: 2025-12-01 09:32:09.244430055 +0000 UTC m=+9.441464531 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  1 04:32:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:32:09 np0005540741 podman[248317]: 2025-12-01 09:32:09.431961021 +0000 UTC m=+0.094397817 container create b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, container_name=nova_compute_init, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm)
Dec  1 04:32:09 np0005540741 podman[248317]: 2025-12-01 09:32:09.357970352 +0000 UTC m=+0.020407138 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  1 04:32:09 np0005540741 python3[248205]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec  1 04:32:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v606: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:10 np0005540741 python3.9[248507]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:32:11 np0005540741 python3.9[248661]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec  1 04:32:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v607: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:11 np0005540741 python3.9[248813]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  1 04:32:12 np0005540741 python3[248965]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:32:13
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['cephfs.cephfs.data', 'cephfs.cephfs.meta', 'volumes', 'vms', '.mgr', 'images', 'backups']
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:32:13 np0005540741 podman[249000]: 2025-12-01 09:32:13.094683397 +0000 UTC m=+0.079029106 container create a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute)
Dec  1 04:32:13 np0005540741 podman[249000]: 2025-12-01 09:32:13.039588501 +0000 UTC m=+0.023934260 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  1 04:32:13 np0005540741 python3[248965]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:32:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v608: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:13 np0005540741 podman[249191]: 2025-12-01 09:32:13.988406614 +0000 UTC m=+0.074125914 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:32:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:32:15 np0005540741 python3.9[249190]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:32:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v609: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:15 np0005540741 python3.9[249366]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:32:16 np0005540741 python3.9[249517]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764581536.0675564-1489-72346236261883/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:32:17 np0005540741 python3.9[249593]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 04:32:17 np0005540741 systemd[1]: Reloading.
Dec  1 04:32:17 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:32:17 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:32:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v610: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:18 np0005540741 python3.9[249704]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:32:18 np0005540741 systemd[1]: Reloading.
Dec  1 04:32:18 np0005540741 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:32:18 np0005540741 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:32:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:32:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:32:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:32:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:32:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:32:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:32:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:32:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:32:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:32:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:32:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:32:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:32:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:32:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:32:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:32:18 np0005540741 systemd[1]: Starting nova_compute container...
Dec  1 04:32:18 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:32:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:18 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:18 np0005540741 podman[249744]: 2025-12-01 09:32:18.716856926 +0000 UTC m=+0.125684778 container init a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Dec  1 04:32:18 np0005540741 podman[249744]: 2025-12-01 09:32:18.726092751 +0000 UTC m=+0.134920563 container start a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  1 04:32:18 np0005540741 podman[249744]: nova_compute
Dec  1 04:32:18 np0005540741 nova_compute[249760]: + sudo -E kolla_set_configs
Dec  1 04:32:18 np0005540741 systemd[1]: Started nova_compute container.
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Validating config file
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Copying service configuration files
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Deleting /etc/ceph
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Creating directory /etc/ceph
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Setting permission for /etc/ceph
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Writing out command to execute
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  1 04:32:18 np0005540741 nova_compute[249760]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  1 04:32:18 np0005540741 nova_compute[249760]: ++ cat /run_command
Dec  1 04:32:18 np0005540741 nova_compute[249760]: + CMD=nova-compute
Dec  1 04:32:18 np0005540741 nova_compute[249760]: + ARGS=
Dec  1 04:32:18 np0005540741 nova_compute[249760]: + sudo kolla_copy_cacerts
Dec  1 04:32:18 np0005540741 nova_compute[249760]: + [[ ! -n '' ]]
Dec  1 04:32:18 np0005540741 nova_compute[249760]: + . kolla_extend_start
Dec  1 04:32:18 np0005540741 nova_compute[249760]: + echo 'Running command: '\''nova-compute'\'''
Dec  1 04:32:18 np0005540741 nova_compute[249760]: Running command: 'nova-compute'
Dec  1 04:32:18 np0005540741 nova_compute[249760]: + umask 0022
Dec  1 04:32:18 np0005540741 nova_compute[249760]: + exec nova-compute
Dec  1 04:32:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v611: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:19 np0005540741 python3.9[249921]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:32:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:32:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:32:20.465 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:32:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:32:20.466 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:32:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:32:20.467 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:32:20 np0005540741 python3.9[250072]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:32:21 np0005540741 nova_compute[249760]: 2025-12-01 09:32:21.137 249764 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 04:32:21 np0005540741 nova_compute[249760]: 2025-12-01 09:32:21.138 249764 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 04:32:21 np0005540741 nova_compute[249760]: 2025-12-01 09:32:21.138 249764 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 04:32:21 np0005540741 nova_compute[249760]: 2025-12-01 09:32:21.138 249764 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec  1 04:32:21 np0005540741 nova_compute[249760]: 2025-12-01 09:32:21.288 249764 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:32:21 np0005540741 python3.9[250224]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:32:21 np0005540741 nova_compute[249760]: 2025-12-01 09:32:21.314 249764 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:32:21 np0005540741 nova_compute[249760]: 2025-12-01 09:32:21.314 249764 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec  1 04:32:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v612: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:21 np0005540741 nova_compute[249760]: 2025-12-01 09:32:21.990 249764 INFO nova.virt.driver [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.130 249764 INFO nova.compute.provider_config [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.146 249764 DEBUG oslo_concurrency.lockutils [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.146 249764 DEBUG oslo_concurrency.lockutils [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.147 249764 DEBUG oslo_concurrency.lockutils [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.147 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.148 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.148 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.148 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.148 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.148 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.149 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.149 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.149 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.149 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.150 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.150 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.150 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.150 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.151 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.151 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.151 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.151 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.151 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.152 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.152 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.152 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.152 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.153 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.153 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.153 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.153 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.153 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.154 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.154 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.154 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.154 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.155 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.155 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.155 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.155 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.155 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.156 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.156 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.156 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.156 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.157 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.157 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.157 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.157 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.157 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.158 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.158 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.158 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.158 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.158 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.158 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.159 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.159 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.159 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.159 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.159 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.159 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.160 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.160 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.160 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.160 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.160 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.160 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.160 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.161 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.161 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.161 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.161 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.161 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.161 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.161 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.162 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.162 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.162 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.162 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.162 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.162 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.163 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.163 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.163 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.163 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.163 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.163 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.163 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.164 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.164 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.164 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.164 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.164 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.164 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.165 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.165 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.165 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.165 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.165 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.165 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.165 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.166 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.166 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.166 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.166 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.166 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.166 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.166 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.167 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.167 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.167 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.167 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.167 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.167 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.167 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.168 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.168 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.168 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.168 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.168 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.168 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.168 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.169 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.169 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.169 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.169 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.169 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.169 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.169 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.170 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.170 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.170 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.170 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.170 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.170 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.171 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.171 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.171 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.171 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.171 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.171 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.171 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.172 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.172 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.172 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.172 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.172 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.172 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.173 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.173 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.173 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.173 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.173 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.173 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.173 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.174 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.174 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.174 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.174 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.174 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.174 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.175 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.175 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.175 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.175 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.175 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.175 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.175 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.176 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.176 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.176 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.176 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.176 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.176 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.177 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.177 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.177 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.177 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.177 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.177 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.177 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.178 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.178 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.178 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.178 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.178 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.178 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.179 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.179 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.179 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.179 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.179 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.179 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.179 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.180 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.180 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.180 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.180 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.180 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.180 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.180 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.181 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.181 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.181 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.181 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.181 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.181 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.181 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.182 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.182 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.182 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.182 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.182 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.182 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.183 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.183 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.183 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.183 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.183 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.183 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.183 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.184 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.184 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.184 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.184 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.184 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.184 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.184 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.185 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.185 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.185 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.185 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.185 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.185 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.185 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.186 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.186 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.186 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.186 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.186 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.186 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.187 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.187 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.187 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.187 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.187 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.187 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.187 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.188 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.188 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.188 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.188 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.188 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.188 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.189 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.189 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.189 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.189 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.189 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.189 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.189 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.190 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.190 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.190 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.190 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.190 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.190 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.190 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.191 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.191 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.191 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.191 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.191 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.191 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.192 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.192 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.192 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.192 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.192 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.192 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.192 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.193 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.193 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.193 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.193 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.193 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.193 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.193 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.194 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.194 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.194 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.194 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.194 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.194 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.194 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.195 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.195 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.195 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.195 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.195 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.195 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.195 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.196 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.196 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.196 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.196 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.196 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.196 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.197 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.197 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.197 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.197 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.197 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.197 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.197 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.198 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.198 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.198 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.198 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.198 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.198 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.199 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.199 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.199 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.199 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.199 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.199 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.199 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.200 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.200 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.200 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.200 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.200 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.200 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.200 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.201 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.201 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.201 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.201 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.201 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.201 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.202 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.202 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.202 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.202 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.202 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.202 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.202 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.203 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.203 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.203 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.203 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.203 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.203 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.203 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.204 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.204 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.204 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.204 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.204 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.205 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.205 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.205 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.205 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.205 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.205 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.206 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.206 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.206 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.206 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.206 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.206 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.206 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.207 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.207 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.207 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.207 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.207 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.207 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.207 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.208 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.208 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.208 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.208 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.209 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.209 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.209 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.209 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.209 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.209 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.210 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.210 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.210 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.210 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.210 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.211 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.211 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.211 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.211 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.211 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.211 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.211 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.212 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.212 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.212 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.212 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.212 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.212 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.213 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.213 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.213 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.213 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.213 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.213 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.213 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.214 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.214 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.214 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.214 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.214 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.214 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.215 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.215 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.215 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.215 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.215 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.215 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.215 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.216 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.216 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.216 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.216 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.216 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.216 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.216 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.217 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.217 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.217 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.217 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.217 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.217 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.217 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.218 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.218 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.218 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.218 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.218 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.218 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.219 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.219 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.219 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.219 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.219 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.219 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.220 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.220 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.220 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.220 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.220 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.220 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.221 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.221 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.221 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.221 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.221 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.221 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.221 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.222 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.222 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.222 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.222 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.222 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.222 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.222 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.223 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.223 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.223 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.223 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.223 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.223 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.223 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.224 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.224 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.224 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.224 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.224 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.224 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.225 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.225 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.225 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.225 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.225 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.225 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.226 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.226 249764 WARNING oslo_config.cfg [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec  1 04:32:22 np0005540741 nova_compute[249760]: live_migration_uri is deprecated for removal in favor of two other options that
Dec  1 04:32:22 np0005540741 nova_compute[249760]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec  1 04:32:22 np0005540741 nova_compute[249760]: and ``live_migration_inbound_addr`` respectively.
Dec  1 04:32:22 np0005540741 nova_compute[249760]: ).  Its value may be silently ignored in the future.#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.226 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.226 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.226 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.227 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.227 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.227 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.227 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.227 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.227 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.227 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.228 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.228 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.228 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.228 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.228 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.228 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.229 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.229 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.229 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rbd_secret_uuid        = 5620a9fb-e540-5250-a0e8-7aaad5347e3b log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.229 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.229 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.229 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.229 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.230 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.230 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.230 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.230 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.230 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.230 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.231 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.231 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.231 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.231 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.231 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.231 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.232 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.232 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.232 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.232 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.232 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.232 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.232 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.233 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.233 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.233 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.233 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.233 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.233 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.234 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.234 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.234 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.234 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.234 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.234 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.234 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.235 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.235 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.235 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.235 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.235 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.235 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.236 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.236 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.236 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.236 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.236 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.236 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.236 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.237 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.237 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.237 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.237 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.237 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.237 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.237 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.238 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.238 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.238 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.238 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.238 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.238 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.238 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.239 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.239 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.239 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.239 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.239 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.239 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.240 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.240 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.240 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.240 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.240 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.240 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.240 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.241 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.241 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.241 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.241 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.241 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.241 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.241 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.242 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.242 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.242 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.242 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.242 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.242 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.243 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.243 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.243 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.243 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.243 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.243 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.243 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.244 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.244 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.244 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.244 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.244 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.244 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.244 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.245 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.245 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.245 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.245 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.245 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.245 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.245 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.246 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.246 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.246 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.246 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.246 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.246 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.247 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.247 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.247 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.247 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.247 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.247 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.248 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.248 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.248 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.248 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.248 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.248 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.249 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.249 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.249 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.249 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.249 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.249 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.249 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.250 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.250 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.250 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.250 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.250 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.250 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.250 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.251 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.251 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.251 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.251 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.251 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.251 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.252 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.252 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.252 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.252 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.252 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.252 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.252 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.253 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.253 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.253 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.253 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.253 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.253 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.254 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.254 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.254 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.254 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.254 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.254 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.255 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.255 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.255 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.255 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.255 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.255 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.255 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.256 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.256 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.256 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.256 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.256 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.256 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.257 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.257 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.257 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.257 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.257 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.257 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.258 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.258 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.258 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.258 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.258 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.258 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.258 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.259 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.259 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.259 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.259 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.259 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.259 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.259 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.260 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.260 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.260 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.260 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.260 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.260 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.260 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.261 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.261 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.261 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.261 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.261 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.261 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.262 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.262 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.262 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.262 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.262 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.262 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.263 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.263 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.263 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.263 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.263 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.263 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.263 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.264 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.264 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.264 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.264 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.264 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.265 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.265 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.265 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.265 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.265 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.265 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.266 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.266 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.266 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.266 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.266 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.266 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.266 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.267 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.267 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.267 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.267 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.267 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.267 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.267 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.268 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.268 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.268 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.268 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.268 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.268 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.268 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.269 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.269 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.269 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.269 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.269 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.269 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.270 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.270 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.270 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.270 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.270 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.270 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.270 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.271 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.271 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.271 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.271 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.271 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.271 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.272 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.272 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.272 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.272 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.272 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.272 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.272 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.273 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.273 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.273 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.273 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.273 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.273 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.274 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.274 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.274 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.274 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.274 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.274 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.274 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.275 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.275 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.275 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.275 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.275 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.275 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.275 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.276 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.276 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.276 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.276 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.276 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.276 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.276 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.277 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.277 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.277 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.277 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.277 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.277 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.278 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.278 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.278 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.278 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.278 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.278 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.278 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.279 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.279 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.279 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.279 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.279 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.279 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.280 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.280 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.280 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.280 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.280 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.280 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.280 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.281 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.281 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.281 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.281 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.281 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.281 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.281 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.282 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.282 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.282 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.282 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.282 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.282 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.282 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.283 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.283 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.283 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.283 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.283 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.283 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.283 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.284 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.284 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.284 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.284 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.284 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.284 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.285 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.285 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.285 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.285 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.285 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.285 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.285 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.286 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.286 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.286 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.286 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.286 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.286 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.286 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.287 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.287 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.287 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.287 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.287 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.287 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.288 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.288 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.288 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.288 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.288 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.288 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.288 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.289 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.289 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.289 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.289 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.289 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.289 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.289 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.290 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.290 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.290 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.290 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.290 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.290 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.290 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.292 249764 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.312 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.313 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.313 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.313 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec  1 04:32:22 np0005540741 systemd[1]: Starting libvirt QEMU daemon...
Dec  1 04:32:22 np0005540741 systemd[1]: Started libvirt QEMU daemon.
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.402 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f4d0fde95b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.406 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f4d0fde95b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.407 249764 INFO nova.virt.libvirt.driver [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Connection event '1' reason 'None'#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.433 249764 WARNING nova.virt.libvirt.driver [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.#033[00m
Dec  1 04:32:22 np0005540741 nova_compute[249760]: 2025-12-01 09:32:22.434 249764 DEBUG nova.virt.libvirt.volume.mount [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Dec  1 04:32:22 np0005540741 python3.9[250378]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec  1 04:32:22 np0005540741 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:32:22 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:32:22 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.0 total, 600.0 interval
Cumulative writes: 3004 writes, 12K keys, 3004 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.01 MB/s
Cumulative WAL: 3004 writes, 3004 syncs, 1.00 writes per sync, written: 0.01 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1278 writes, 5303 keys, 1278 commit groups, 1.0 writes per commit group, ingest: 5.67 MB, 0.01 MB/s
Interval WAL: 1278 writes, 1278 syncs, 1.00 writes per sync, written: 0.01 GB, 0.01 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    109.6      0.09              0.04         6    0.015       0      0       0.0       0.0
  L6      1/0    4.44 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.4    150.7    122.6      0.19              0.09         5    0.038     16K   2271       0.0       0.0
 Sum      1/0    4.44 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.4    102.7    118.5      0.28              0.12        11    0.026     16K   2271       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   4.4    130.7    133.8      0.14              0.05         6    0.023     10K   1498       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0    150.7    122.6      0.19              0.09         5    0.038     16K   2271       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    112.2      0.09              0.04         5    0.017       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     19.6      0.00              0.00         1    0.003       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.0 total, 600.0 interval
Flush(GB): cumulative 0.010, interval 0.004
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.03 GB write, 0.03 MB/s write, 0.03 GB read, 0.02 MB/s read, 0.3 seconds
Interval compaction: 0.02 GB write, 0.03 MB/s write, 0.02 GB read, 0.03 MB/s read, 0.1 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55bbd56b51f0#2 capacity: 308.00 MB usage: 1.30 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 6.4e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(81,1.15 MB,0.374767%) FilterBlock(12,52.61 KB,0.0166806%) IndexBlock(12,99.75 KB,0.0316273%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
Dec  1 04:32:23 np0005540741 podman[250566]: 2025-12-01 09:32:23.153174682 +0000 UTC m=+0.094428538 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec  1 04:32:23 np0005540741 nova_compute[249760]: 2025-12-01 09:32:23.357 249764 INFO nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Libvirt host capabilities <capabilities>
Dec  1 04:32:23 np0005540741 nova_compute[249760]: 
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <host>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <uuid>52310927-1d30-4bda-9d2b-fd9f7cfadc4d</uuid>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <cpu>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <arch>x86_64</arch>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model>EPYC-Rome-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <vendor>AMD</vendor>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <microcode version='16777317'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <signature family='23' model='49' stepping='0'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <maxphysaddr mode='emulate' bits='40'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='x2apic'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='tsc-deadline'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='osxsave'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='hypervisor'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='tsc_adjust'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='spec-ctrl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='stibp'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='arch-capabilities'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='ssbd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='cmp_legacy'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='topoext'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='virt-ssbd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='lbrv'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='tsc-scale'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='vmcb-clean'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='pause-filter'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='pfthreshold'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='svme-addr-chk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='rdctl-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='skip-l1dfl-vmentry'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='mds-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature name='pschange-mc-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <pages unit='KiB' size='4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <pages unit='KiB' size='2048'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <pages unit='KiB' size='1048576'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </cpu>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <power_management>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <suspend_mem/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </power_management>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <iommu support='no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <migration_features>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <live/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <uri_transports>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <uri_transport>tcp</uri_transport>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <uri_transport>rdma</uri_transport>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </uri_transports>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </migration_features>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <topology>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <cells num='1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <cell id='0'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:          <memory unit='KiB'>7864320</memory>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:          <pages unit='KiB' size='4'>1966080</pages>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:          <pages unit='KiB' size='2048'>0</pages>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:          <pages unit='KiB' size='1048576'>0</pages>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:          <distances>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:            <sibling id='0' value='10'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:          </distances>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:          <cpus num='8'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:          </cpus>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        </cell>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </cells>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </topology>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <cache>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </cache>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <secmodel>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model>selinux</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <doi>0</doi>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </secmodel>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <secmodel>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model>dac</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <doi>0</doi>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </secmodel>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </host>
Dec  1 04:32:23 np0005540741 nova_compute[249760]: 
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <guest>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <os_type>hvm</os_type>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <arch name='i686'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <wordsize>32</wordsize>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <domain type='qemu'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <domain type='kvm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </arch>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <features>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <pae/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <nonpae/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <acpi default='on' toggle='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <apic default='on' toggle='no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <cpuselection/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <deviceboot/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <disksnapshot default='on' toggle='no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <externalSnapshot/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </features>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </guest>
Dec  1 04:32:23 np0005540741 nova_compute[249760]: 
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <guest>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <os_type>hvm</os_type>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <arch name='x86_64'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <wordsize>64</wordsize>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <domain type='qemu'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <domain type='kvm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </arch>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <features>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <acpi default='on' toggle='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <apic default='on' toggle='no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <cpuselection/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <deviceboot/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <disksnapshot default='on' toggle='no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <externalSnapshot/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </features>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </guest>
Dec  1 04:32:23 np0005540741 nova_compute[249760]: 
Dec  1 04:32:23 np0005540741 nova_compute[249760]: </capabilities>
Dec  1 04:32:23 np0005540741 nova_compute[249760]: 
Dec  1 04:32:23 np0005540741 nova_compute[249760]: 2025-12-01 09:32:23.365 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec  1 04:32:23 np0005540741 nova_compute[249760]: 2025-12-01 09:32:23.389 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec  1 04:32:23 np0005540741 nova_compute[249760]: <domainCapabilities>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <domain>kvm</domain>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <arch>i686</arch>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <vcpu max='4096'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <iothreads supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <os supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <enum name='firmware'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <loader supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>rom</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pflash</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='readonly'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>yes</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>no</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='secure'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>no</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </loader>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </os>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <cpu>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <mode name='host-passthrough' supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='hostPassthroughMigratable'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>on</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>off</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </mode>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <mode name='maximum' supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='maximumMigratable'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>on</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>off</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </mode>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <mode name='host-model' supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <vendor>AMD</vendor>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='x2apic'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='hypervisor'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='stibp'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='ssbd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='overflow-recov'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='succor'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='ibrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='lbrv'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='tsc-scale'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='flushbyasid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='pause-filter'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='pfthreshold'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='disable' name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </mode>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <mode name='custom' supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-noTSX'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cooperlake'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cooperlake-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cooperlake-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Denverton'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mpx'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Denverton-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mpx'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Denverton-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Denverton-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Dhyana-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Genoa'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amd-psfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='auto-ibrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='stibp-always-on'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amd-psfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='auto-ibrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='stibp-always-on'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Milan'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Milan-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Milan-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amd-psfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='stibp-always-on'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Rome'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Rome-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Rome-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Rome-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='GraniteRapids'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='prefetchiti'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='GraniteRapids-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='prefetchiti'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='GraniteRapids-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx10'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx10-128'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx10-256'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx10-512'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='prefetchiti'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-noTSX'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v5'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v6'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v7'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='IvyBridge'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='IvyBridge-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='IvyBridge-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='IvyBridge-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='KnightsMill'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-4fmaps'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-4vnniw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512er'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512pf'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='KnightsMill-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-4fmaps'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-4vnniw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512er'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512pf'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 04:32:23 np0005540741 python3.9[250637]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Opteron_G4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fma4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xop'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Opteron_G4-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fma4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xop'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Opteron_G5'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fma4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tbm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xop'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Opteron_G5-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fma4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tbm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xop'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SapphireRapids'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SapphireRapids-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SapphireRapids-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SapphireRapids-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SierraForest'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-ne-convert'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cmpccxadd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SierraForest-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-ne-convert'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cmpccxadd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v5'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='core-capability'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mpx'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='split-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='core-capability'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mpx'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='split-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='core-capability'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='split-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='core-capability'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='split-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='athlon'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnow'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnowext'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='athlon-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnow'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnowext'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='core2duo'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='core2duo-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='coreduo'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='coreduo-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='n270'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='n270-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='phenom'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnow'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnowext'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='phenom-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnow'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnowext'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </mode>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </cpu>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <memoryBacking supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <enum name='sourceType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>file</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>anonymous</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>memfd</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </memoryBacking>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <devices>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <disk supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='diskDevice'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>disk</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>cdrom</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>floppy</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>lun</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='bus'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>fdc</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>scsi</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>usb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>sata</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio-transitional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio-non-transitional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </disk>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <graphics supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vnc</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>egl-headless</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>dbus</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </graphics>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <video supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='modelType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vga</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>cirrus</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>none</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>bochs</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>ramfb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </video>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <hostdev supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='mode'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>subsystem</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='startupPolicy'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>default</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>mandatory</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>requisite</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>optional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='subsysType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>usb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pci</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>scsi</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='capsType'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='pciBackend'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </hostdev>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <rng supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio-transitional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio-non-transitional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendModel'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>random</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>egd</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>builtin</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </rng>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <filesystem supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='driverType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>path</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>handle</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtiofs</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </filesystem>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <tpm supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tpm-tis</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tpm-crb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendModel'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>emulator</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>external</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendVersion'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>2.0</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </tpm>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <redirdev supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='bus'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>usb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </redirdev>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <channel supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pty</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>unix</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </channel>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <crypto supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>qemu</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendModel'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>builtin</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </crypto>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <interface supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>default</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>passt</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </interface>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <panic supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>isa</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>hyperv</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </panic>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <console supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>null</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vc</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pty</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>dev</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>file</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pipe</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>stdio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>udp</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tcp</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>unix</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>qemu-vdagent</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>dbus</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </console>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </devices>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <features>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <gic supported='no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <vmcoreinfo supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <genid supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <backingStoreInput supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <backup supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <async-teardown supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <ps2 supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <sev supported='no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <sgx supported='no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <hyperv supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='features'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>relaxed</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vapic</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>spinlocks</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vpindex</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>runtime</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>synic</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>stimer</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>reset</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vendor_id</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>frequencies</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>reenlightenment</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tlbflush</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>ipi</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>avic</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>emsr_bitmap</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>xmm_input</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <defaults>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <spinlocks>4095</spinlocks>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <stimer_direct>on</stimer_direct>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </defaults>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </hyperv>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <launchSecurity supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='sectype'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tdx</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </launchSecurity>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </features>
Dec  1 04:32:23 np0005540741 nova_compute[249760]: </domainCapabilities>
Dec  1 04:32:23 np0005540741 nova_compute[249760]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  1 04:32:23 np0005540741 nova_compute[249760]: 2025-12-01 09:32:23.398 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec  1 04:32:23 np0005540741 nova_compute[249760]: <domainCapabilities>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <domain>kvm</domain>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <arch>i686</arch>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <vcpu max='240'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <iothreads supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <os supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <enum name='firmware'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <loader supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>rom</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pflash</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='readonly'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>yes</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>no</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='secure'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>no</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </loader>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </os>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <cpu>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <mode name='host-passthrough' supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='hostPassthroughMigratable'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>on</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>off</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </mode>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <mode name='maximum' supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='maximumMigratable'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>on</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>off</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </mode>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <mode name='host-model' supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <vendor>AMD</vendor>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='x2apic'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='hypervisor'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='stibp'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='ssbd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='overflow-recov'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='succor'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='ibrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='lbrv'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='tsc-scale'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='flushbyasid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='pause-filter'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='pfthreshold'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='disable' name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </mode>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <mode name='custom' supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-noTSX'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cooperlake'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cooperlake-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cooperlake-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 systemd[1]: Stopping nova_compute container...
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Denverton'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mpx'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Denverton-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mpx'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Denverton-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Denverton-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Dhyana-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Genoa'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amd-psfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='auto-ibrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='stibp-always-on'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amd-psfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='auto-ibrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='stibp-always-on'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Milan'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Milan-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Milan-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amd-psfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='stibp-always-on'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Rome'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Rome-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Rome-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Rome-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='GraniteRapids'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='prefetchiti'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='GraniteRapids-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='prefetchiti'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='GraniteRapids-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx10'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx10-128'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx10-256'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx10-512'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='prefetchiti'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-noTSX'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v5'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v6'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v7'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='IvyBridge'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='IvyBridge-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='IvyBridge-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='IvyBridge-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='KnightsMill'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-4fmaps'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-4vnniw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512er'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512pf'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='KnightsMill-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-4fmaps'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-4vnniw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512er'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512pf'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Opteron_G4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fma4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xop'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Opteron_G4-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fma4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xop'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Opteron_G5'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fma4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tbm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xop'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Opteron_G5-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fma4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tbm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xop'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SapphireRapids'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SapphireRapids-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SapphireRapids-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SapphireRapids-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SierraForest'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-ne-convert'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cmpccxadd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SierraForest-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-ne-convert'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cmpccxadd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v5'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='core-capability'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mpx'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='split-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='core-capability'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mpx'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='split-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='core-capability'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='split-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='core-capability'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='split-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='athlon'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnow'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnowext'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='athlon-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnow'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnowext'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='core2duo'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='core2duo-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='coreduo'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='coreduo-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='n270'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='n270-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='phenom'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnow'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnowext'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='phenom-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnow'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnowext'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </mode>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </cpu>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <memoryBacking supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <enum name='sourceType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>file</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>anonymous</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>memfd</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </memoryBacking>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <devices>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <disk supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='diskDevice'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>disk</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>cdrom</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>floppy</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>lun</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='bus'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>ide</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>fdc</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>scsi</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>usb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>sata</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio-transitional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio-non-transitional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </disk>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <graphics supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vnc</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>egl-headless</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>dbus</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </graphics>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <video supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='modelType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vga</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>cirrus</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>none</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>bochs</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>ramfb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </video>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <hostdev supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='mode'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>subsystem</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='startupPolicy'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>default</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>mandatory</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>requisite</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>optional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='subsysType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>usb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pci</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>scsi</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='capsType'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='pciBackend'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </hostdev>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <rng supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio-transitional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio-non-transitional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendModel'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>random</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>egd</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>builtin</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </rng>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <filesystem supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='driverType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>path</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>handle</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtiofs</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </filesystem>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <tpm supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tpm-tis</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tpm-crb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendModel'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>emulator</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>external</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendVersion'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>2.0</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </tpm>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <redirdev supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='bus'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>usb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </redirdev>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <channel supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pty</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>unix</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </channel>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <crypto supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>qemu</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendModel'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>builtin</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </crypto>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <interface supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>default</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>passt</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </interface>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <panic supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>isa</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>hyperv</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </panic>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <console supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>null</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vc</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pty</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>dev</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>file</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pipe</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>stdio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>udp</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tcp</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>unix</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>qemu-vdagent</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>dbus</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </console>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </devices>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <features>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <gic supported='no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <vmcoreinfo supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <genid supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <backingStoreInput supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <backup supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <async-teardown supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <ps2 supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <sev supported='no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <sgx supported='no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <hyperv supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='features'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>relaxed</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vapic</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>spinlocks</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vpindex</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>runtime</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>synic</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>stimer</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>reset</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vendor_id</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>frequencies</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>reenlightenment</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tlbflush</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>ipi</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>avic</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>emsr_bitmap</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>xmm_input</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <defaults>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <spinlocks>4095</spinlocks>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <stimer_direct>on</stimer_direct>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </defaults>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </hyperv>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <launchSecurity supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='sectype'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tdx</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </launchSecurity>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </features>
Dec  1 04:32:23 np0005540741 nova_compute[249760]: </domainCapabilities>
Dec  1 04:32:23 np0005540741 nova_compute[249760]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  1 04:32:23 np0005540741 nova_compute[249760]: 2025-12-01 09:32:23.434 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec  1 04:32:23 np0005540741 nova_compute[249760]: 2025-12-01 09:32:23.438 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec  1 04:32:23 np0005540741 nova_compute[249760]: <domainCapabilities>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <domain>kvm</domain>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <arch>x86_64</arch>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <vcpu max='4096'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <iothreads supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <os supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <enum name='firmware'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>efi</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <loader supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>rom</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pflash</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='readonly'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>yes</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>no</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='secure'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>yes</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>no</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </loader>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </os>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <cpu>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <mode name='host-passthrough' supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='hostPassthroughMigratable'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>on</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>off</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </mode>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <mode name='maximum' supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='maximumMigratable'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>on</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>off</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </mode>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <mode name='host-model' supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <vendor>AMD</vendor>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='x2apic'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='hypervisor'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='stibp'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='ssbd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='overflow-recov'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='succor'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='ibrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='lbrv'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='tsc-scale'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='flushbyasid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='pause-filter'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='pfthreshold'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='disable' name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </mode>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <mode name='custom' supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-noTSX'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cooperlake'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cooperlake-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cooperlake-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Denverton'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mpx'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Denverton-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mpx'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Denverton-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Denverton-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Dhyana-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Genoa'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amd-psfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='auto-ibrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='stibp-always-on'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amd-psfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='auto-ibrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='stibp-always-on'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Milan'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Milan-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Milan-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amd-psfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='stibp-always-on'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Rome'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Rome-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Rome-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Rome-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='GraniteRapids'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='prefetchiti'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='GraniteRapids-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='prefetchiti'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='GraniteRapids-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx10'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx10-128'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx10-256'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx10-512'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='prefetchiti'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-noTSX'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v5'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v6'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v7'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='IvyBridge'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='IvyBridge-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='IvyBridge-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='IvyBridge-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='KnightsMill'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-4fmaps'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-4vnniw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512er'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512pf'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='KnightsMill-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-4fmaps'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-4vnniw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512er'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512pf'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Opteron_G4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fma4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xop'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Opteron_G4-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fma4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xop'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Opteron_G5'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fma4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tbm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xop'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Opteron_G5-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fma4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tbm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xop'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SapphireRapids'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SapphireRapids-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SapphireRapids-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SapphireRapids-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SierraForest'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-ne-convert'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cmpccxadd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SierraForest-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-ne-convert'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cmpccxadd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v5'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='core-capability'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mpx'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='split-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='core-capability'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mpx'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='split-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='core-capability'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='split-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='core-capability'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='split-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='athlon'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnow'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnowext'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='athlon-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnow'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnowext'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='core2duo'>
Dec  1 04:32:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v613: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='core2duo-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='coreduo'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='coreduo-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='n270'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='n270-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='phenom'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnow'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnowext'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='phenom-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnow'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnowext'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </mode>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </cpu>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <memoryBacking supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <enum name='sourceType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>file</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>anonymous</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>memfd</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </memoryBacking>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <devices>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <disk supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='diskDevice'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>disk</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>cdrom</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>floppy</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>lun</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='bus'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>fdc</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>scsi</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>usb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>sata</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio-transitional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio-non-transitional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </disk>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <graphics supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vnc</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>egl-headless</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>dbus</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </graphics>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <video supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='modelType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vga</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>cirrus</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>none</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>bochs</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>ramfb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </video>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <hostdev supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='mode'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>subsystem</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='startupPolicy'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>default</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>mandatory</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>requisite</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>optional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='subsysType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>usb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pci</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>scsi</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='capsType'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='pciBackend'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </hostdev>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <rng supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio-transitional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio-non-transitional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendModel'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>random</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>egd</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>builtin</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </rng>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <filesystem supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='driverType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>path</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>handle</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtiofs</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </filesystem>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <tpm supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tpm-tis</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tpm-crb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendModel'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>emulator</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>external</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendVersion'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>2.0</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </tpm>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <redirdev supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='bus'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>usb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </redirdev>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <channel supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pty</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>unix</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </channel>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <crypto supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>qemu</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendModel'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>builtin</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </crypto>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <interface supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>default</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>passt</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </interface>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <panic supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>isa</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>hyperv</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </panic>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <console supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>null</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vc</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pty</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>dev</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>file</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pipe</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>stdio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>udp</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tcp</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>unix</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>qemu-vdagent</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>dbus</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </console>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </devices>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <features>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <gic supported='no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <vmcoreinfo supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <genid supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <backingStoreInput supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <backup supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <async-teardown supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <ps2 supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <sev supported='no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <sgx supported='no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <hyperv supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='features'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>relaxed</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vapic</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>spinlocks</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vpindex</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>runtime</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>synic</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>stimer</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>reset</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vendor_id</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>frequencies</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>reenlightenment</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tlbflush</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>ipi</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>avic</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>emsr_bitmap</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>xmm_input</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <defaults>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <spinlocks>4095</spinlocks>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <stimer_direct>on</stimer_direct>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </defaults>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </hyperv>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <launchSecurity supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='sectype'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tdx</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </launchSecurity>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </features>
Dec  1 04:32:23 np0005540741 nova_compute[249760]: </domainCapabilities>
Dec  1 04:32:23 np0005540741 nova_compute[249760]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  1 04:32:23 np0005540741 nova_compute[249760]: 2025-12-01 09:32:23.507 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec  1 04:32:23 np0005540741 nova_compute[249760]: <domainCapabilities>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <domain>kvm</domain>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <arch>x86_64</arch>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <vcpu max='240'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <iothreads supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <os supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <enum name='firmware'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <loader supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>rom</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pflash</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='readonly'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>yes</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>no</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='secure'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>no</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </loader>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </os>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <cpu>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <mode name='host-passthrough' supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='hostPassthroughMigratable'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>on</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>off</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </mode>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <mode name='maximum' supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='maximumMigratable'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>on</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>off</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </mode>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <mode name='host-model' supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <vendor>AMD</vendor>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='x2apic'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='hypervisor'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='stibp'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='ssbd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='overflow-recov'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='succor'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='ibrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='lbrv'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='tsc-scale'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='flushbyasid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='pause-filter'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='pfthreshold'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <feature policy='disable' name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </mode>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <mode name='custom' supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-noTSX'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Broadwell-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cooperlake'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cooperlake-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Cooperlake-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Denverton'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mpx'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Denverton-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mpx'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Denverton-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Denverton-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Dhyana-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Genoa'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amd-psfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='auto-ibrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='stibp-always-on'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amd-psfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='auto-ibrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='stibp-always-on'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Milan'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Milan-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Milan-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amd-psfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='stibp-always-on'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Rome'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Rome-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Rome-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-Rome-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='EPYC-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='GraniteRapids'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='prefetchiti'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='GraniteRapids-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='prefetchiti'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='GraniteRapids-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx10'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx10-128'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx10-256'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx10-512'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='prefetchiti'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-noTSX'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Haswell-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v5'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v6'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Icelake-Server-v7'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='IvyBridge'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='IvyBridge-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='IvyBridge-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='IvyBridge-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='KnightsMill'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-4fmaps'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-4vnniw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512er'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512pf'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='KnightsMill-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-4fmaps'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-4vnniw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512er'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512pf'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Opteron_G4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fma4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xop'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Opteron_G4-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fma4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xop'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Opteron_G5'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fma4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tbm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xop'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Opteron_G5-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fma4'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tbm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xop'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SapphireRapids'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SapphireRapids-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SapphireRapids-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SapphireRapids-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='amx-tile'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-bf16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-fp16'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bitalg'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrc'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fzrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='la57'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='taa-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xfd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SierraForest'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-ne-convert'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cmpccxadd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='SierraForest-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-ifma'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-ne-convert'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx-vnni-int8'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cmpccxadd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fbsdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='fsrs'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ibrs-all'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mcdt-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pbrsb-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='psdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='serialize'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vaes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Client-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='hle'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='rtm'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Skylake-Server-v5'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512bw'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512cd'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512dq'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512f'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='avx512vl'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='invpcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pcid'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='pku'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='core-capability'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mpx'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='split-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='core-capability'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='mpx'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='split-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge-v2'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='core-capability'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='split-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge-v3'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='core-capability'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='split-lock-detect'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='Snowridge-v4'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='cldemote'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='erms'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='gfni'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdir64b'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='movdiri'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='xsaves'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='athlon'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnow'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnowext'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='athlon-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnow'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnowext'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='core2duo'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='core2duo-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='coreduo'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='coreduo-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='n270'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='n270-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='ss'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='phenom'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnow'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnowext'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <blockers model='phenom-v1'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnow'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <feature name='3dnowext'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </blockers>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </mode>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </cpu>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <memoryBacking supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <enum name='sourceType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>file</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>anonymous</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <value>memfd</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </memoryBacking>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <devices>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <disk supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='diskDevice'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>disk</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>cdrom</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>floppy</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>lun</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='bus'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>ide</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>fdc</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>scsi</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>usb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>sata</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio-transitional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio-non-transitional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </disk>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <graphics supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vnc</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>egl-headless</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>dbus</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </graphics>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <video supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='modelType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vga</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>cirrus</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>none</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>bochs</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>ramfb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </video>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <hostdev supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='mode'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>subsystem</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='startupPolicy'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>default</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>mandatory</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>requisite</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>optional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='subsysType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>usb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pci</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>scsi</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='capsType'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='pciBackend'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </hostdev>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <rng supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio-transitional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtio-non-transitional</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendModel'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>random</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>egd</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>builtin</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </rng>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <filesystem supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='driverType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>path</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>handle</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>virtiofs</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </filesystem>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <tpm supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tpm-tis</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tpm-crb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendModel'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>emulator</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>external</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendVersion'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>2.0</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </tpm>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <redirdev supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='bus'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>usb</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </redirdev>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <channel supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pty</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>unix</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </channel>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <crypto supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>qemu</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendModel'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>builtin</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </crypto>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <interface supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='backendType'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>default</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>passt</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </interface>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <panic supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='model'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>isa</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>hyperv</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </panic>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <console supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='type'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>null</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vc</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pty</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>dev</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>file</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>pipe</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>stdio</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>udp</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tcp</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>unix</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>qemu-vdagent</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>dbus</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </console>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </devices>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  <features>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <gic supported='no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <vmcoreinfo supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <genid supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <backingStoreInput supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <backup supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <async-teardown supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <ps2 supported='yes'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <sev supported='no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <sgx supported='no'/>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <hyperv supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='features'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>relaxed</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vapic</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>spinlocks</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vpindex</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>runtime</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>synic</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>stimer</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>reset</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>vendor_id</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>frequencies</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>reenlightenment</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tlbflush</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>ipi</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>avic</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>emsr_bitmap</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>xmm_input</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <defaults>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <spinlocks>4095</spinlocks>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <stimer_direct>on</stimer_direct>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </defaults>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </hyperv>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    <launchSecurity supported='yes'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      <enum name='sectype'>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:        <value>tdx</value>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:      </enum>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:    </launchSecurity>
Dec  1 04:32:23 np0005540741 nova_compute[249760]:  </features>
Dec  1 04:32:23 np0005540741 nova_compute[249760]: </domainCapabilities>
Dec  1 04:32:23 np0005540741 nova_compute[249760]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  1 04:32:23 np0005540741 nova_compute[249760]: 2025-12-01 09:32:23.568 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec  1 04:32:23 np0005540741 nova_compute[249760]: 2025-12-01 09:32:23.568 249764 INFO nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Secure Boot support detected
Dec  1 04:32:23 np0005540741 nova_compute[249760]: 2025-12-01 09:32:23.569 249764 DEBUG oslo_concurrency.lockutils [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  1 04:32:23 np0005540741 nova_compute[249760]: 2025-12-01 09:32:23.569 249764 DEBUG oslo_concurrency.lockutils [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  1 04:32:23 np0005540741 nova_compute[249760]: 2025-12-01 09:32:23.570 249764 DEBUG oslo_concurrency.lockutils [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  1 04:32:23 np0005540741 virtqemud[250400]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec  1 04:32:23 np0005540741 virtqemud[250400]: hostname: compute-0
Dec  1 04:32:23 np0005540741 virtqemud[250400]: End of file while reading data: Input/output error
Dec  1 04:32:23 np0005540741 systemd[1]: libpod-a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8.scope: Deactivated successfully.
Dec  1 04:32:23 np0005540741 systemd[1]: libpod-a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8.scope: Consumed 3.294s CPU time.
Dec  1 04:32:23 np0005540741 podman[250647]: 2025-12-01 09:32:23.987350046 +0000 UTC m=+0.511325085 container died a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team)
Dec  1 04:32:24 np0005540741 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8-userdata-shm.mount: Deactivated successfully.
Dec  1 04:32:24 np0005540741 systemd[1]: var-lib-containers-storage-overlay-f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49-merged.mount: Deactivated successfully.
Dec  1 04:32:24 np0005540741 podman[250647]: 2025-12-01 09:32:24.578100655 +0000 UTC m=+1.102075674 container cleanup a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  1 04:32:24 np0005540741 podman[250647]: nova_compute
Dec  1 04:32:24 np0005540741 podman[250677]: nova_compute
Dec  1 04:32:24 np0005540741 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec  1 04:32:24 np0005540741 systemd[1]: Stopped nova_compute container.
Dec  1 04:32:24 np0005540741 systemd[1]: Starting nova_compute container...
Dec  1 04:32:24 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:32:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:24 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:24 np0005540741 podman[250690]: 2025-12-01 09:32:24.818631176 +0000 UTC m=+0.122525006 container init a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible)
Dec  1 04:32:24 np0005540741 podman[250690]: 2025-12-01 09:32:24.832571148 +0000 UTC m=+0.136464908 container start a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec  1 04:32:24 np0005540741 nova_compute[250706]: + sudo -E kolla_set_configs
Dec  1 04:32:24 np0005540741 podman[250690]: nova_compute
Dec  1 04:32:24 np0005540741 systemd[1]: Started nova_compute container.
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Validating config file
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Copying service configuration files
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Deleting /etc/ceph
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Creating directory /etc/ceph
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Setting permission for /etc/ceph
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Writing out command to execute
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  1 04:32:24 np0005540741 nova_compute[250706]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  1 04:32:24 np0005540741 nova_compute[250706]: ++ cat /run_command
Dec  1 04:32:24 np0005540741 nova_compute[250706]: + CMD=nova-compute
Dec  1 04:32:24 np0005540741 nova_compute[250706]: + ARGS=
Dec  1 04:32:24 np0005540741 nova_compute[250706]: + sudo kolla_copy_cacerts
Dec  1 04:32:24 np0005540741 nova_compute[250706]: + [[ ! -n '' ]]
Dec  1 04:32:24 np0005540741 nova_compute[250706]: + . kolla_extend_start
Dec  1 04:32:24 np0005540741 nova_compute[250706]: Running command: 'nova-compute'
Dec  1 04:32:24 np0005540741 nova_compute[250706]: + echo 'Running command: '\''nova-compute'\'''
Dec  1 04:32:24 np0005540741 nova_compute[250706]: + umask 0022
Dec  1 04:32:24 np0005540741 nova_compute[250706]: + exec nova-compute
Dec  1 04:32:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:32:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v614: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:25 np0005540741 python3.9[250869]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec  1 04:32:25 np0005540741 systemd[1]: Started libpod-conmon-b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c.scope.
Dec  1 04:32:25 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:32:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93a45807daa2df8bc0367a7472b5d2e2c9f0891beaa6ea83744c4fe907ed08e1/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93a45807daa2df8bc0367a7472b5d2e2c9f0891beaa6ea83744c4fe907ed08e1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93a45807daa2df8bc0367a7472b5d2e2c9f0891beaa6ea83744c4fe907ed08e1/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:26 np0005540741 podman[250896]: 2025-12-01 09:32:26.009586696 +0000 UTC m=+0.133604566 container init b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:32:26 np0005540741 podman[250896]: 2025-12-01 09:32:26.016209576 +0000 UTC m=+0.140227426 container start b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  1 04:32:26 np0005540741 python3.9[250869]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec  1 04:32:26 np0005540741 nova_compute_init[250918]: INFO:nova_statedir:Applying nova statedir ownership
Dec  1 04:32:26 np0005540741 nova_compute_init[250918]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec  1 04:32:26 np0005540741 nova_compute_init[250918]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec  1 04:32:26 np0005540741 nova_compute_init[250918]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec  1 04:32:26 np0005540741 nova_compute_init[250918]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec  1 04:32:26 np0005540741 nova_compute_init[250918]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec  1 04:32:26 np0005540741 nova_compute_init[250918]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec  1 04:32:26 np0005540741 nova_compute_init[250918]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec  1 04:32:26 np0005540741 nova_compute_init[250918]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec  1 04:32:26 np0005540741 nova_compute_init[250918]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec  1 04:32:26 np0005540741 nova_compute_init[250918]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec  1 04:32:26 np0005540741 nova_compute_init[250918]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec  1 04:32:26 np0005540741 nova_compute_init[250918]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec  1 04:32:26 np0005540741 nova_compute_init[250918]: INFO:nova_statedir:Nova statedir ownership complete
Dec  1 04:32:26 np0005540741 systemd[1]: libpod-b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c.scope: Deactivated successfully.
Dec  1 04:32:26 np0005540741 podman[250932]: 2025-12-01 09:32:26.11120915 +0000 UTC m=+0.026115702 container died b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init)
Dec  1 04:32:26 np0005540741 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c-userdata-shm.mount: Deactivated successfully.
Dec  1 04:32:26 np0005540741 systemd[1]: var-lib-containers-storage-overlay-93a45807daa2df8bc0367a7472b5d2e2c9f0891beaa6ea83744c4fe907ed08e1-merged.mount: Deactivated successfully.
Dec  1 04:32:26 np0005540741 podman[250932]: 2025-12-01 09:32:26.140961816 +0000 UTC m=+0.055868378 container cleanup b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, config_id=edpm, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec  1 04:32:26 np0005540741 systemd[1]: libpod-conmon-b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c.scope: Deactivated successfully.
Dec  1 04:32:26 np0005540741 systemd[1]: session-50.scope: Deactivated successfully.
Dec  1 04:32:26 np0005540741 systemd[1]: session-50.scope: Consumed 2min 21.501s CPU time.
Dec  1 04:32:26 np0005540741 systemd-logind[788]: Session 50 logged out. Waiting for processes to exit.
Dec  1 04:32:26 np0005540741 systemd-logind[788]: Removed session 50.
Dec  1 04:32:26 np0005540741 nova_compute[250706]: 2025-12-01 09:32:26.884 250710 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 04:32:26 np0005540741 nova_compute[250706]: 2025-12-01 09:32:26.885 250710 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 04:32:26 np0005540741 nova_compute[250706]: 2025-12-01 09:32:26.885 250710 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 04:32:26 np0005540741 nova_compute[250706]: 2025-12-01 09:32:26.885 250710 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec  1 04:32:26 np0005540741 podman[250981]: 2025-12-01 09:32:26.937570929 +0000 UTC m=+0.066578097 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.021 250710 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.042 250710 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.043 250710 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.526 250710 INFO nova.virt.driver [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec  1 04:32:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v615: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.626 250710 INFO nova.compute.provider_config [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.643 250710 DEBUG oslo_concurrency.lockutils [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.643 250710 DEBUG oslo_concurrency.lockutils [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.643 250710 DEBUG oslo_concurrency.lockutils [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.644 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.644 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.644 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.644 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.644 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.645 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.645 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.645 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.645 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.645 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.645 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.645 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.646 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.646 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.646 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.646 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.646 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.646 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.647 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.647 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.647 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.647 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.647 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.647 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.648 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.648 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.648 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.648 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.648 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.649 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.649 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.649 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.649 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.649 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.650 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.650 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.650 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.650 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.650 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.650 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.651 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.651 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.651 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.651 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.652 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.652 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.652 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.652 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.652 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.652 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.653 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.653 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.653 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.653 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.653 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.653 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.653 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.654 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.654 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.654 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.654 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.654 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.654 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.654 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.655 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.655 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.655 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.655 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.655 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.655 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.655 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.656 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.656 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.656 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.656 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.656 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.656 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.656 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.657 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.657 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.657 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.657 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.657 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.657 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.658 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.658 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.658 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.658 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.658 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.658 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.658 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.659 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.659 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.659 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.659 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.659 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.659 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.659 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.660 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.660 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.660 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.660 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.660 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.660 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.660 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.661 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.661 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.661 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.661 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.661 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.661 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.661 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.662 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.662 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.662 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.662 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.662 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.662 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.662 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.663 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.663 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.663 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.663 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.663 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.663 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.663 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.664 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.664 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.664 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.664 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.664 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.664 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.664 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.665 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.665 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.665 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.665 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.665 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.665 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.665 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.666 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.666 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.666 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.666 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.666 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.666 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.667 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.667 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.667 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.667 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.667 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.667 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.667 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.668 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.668 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.668 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.668 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.668 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.668 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.669 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.669 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.669 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.669 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.669 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.669 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.669 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.670 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.670 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.670 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.670 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.670 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.670 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.670 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.671 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.671 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.671 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.671 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.671 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.671 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.672 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.672 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.672 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.672 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.672 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.672 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.672 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.673 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.673 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.673 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.673 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.673 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.673 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.674 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.674 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.674 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.674 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.674 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.674 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.674 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.675 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.675 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.675 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.675 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.675 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.675 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.675 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.676 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.676 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.676 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.676 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.676 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.676 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.676 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.677 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.677 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.677 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.677 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.677 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.677 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.677 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.678 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.678 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.678 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.678 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.678 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.678 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.678 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.679 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.679 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.679 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.679 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.679 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.679 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.680 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.680 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.680 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.680 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.680 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.680 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.680 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.681 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.681 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.681 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.681 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.681 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.681 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.682 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.682 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.682 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.682 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.682 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.682 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.682 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.682 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.683 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.683 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.683 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.683 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.683 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.683 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.683 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.684 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.684 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.684 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.684 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.684 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.684 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.685 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.685 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.685 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.685 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.685 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.685 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.685 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.686 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.686 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.686 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.686 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.686 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.686 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.686 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.687 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.687 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.687 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.687 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.687 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.687 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.687 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.688 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.688 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.688 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.688 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.688 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.688 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.688 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.689 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.689 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.689 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.689 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.689 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.689 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.689 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.690 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.690 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.690 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.690 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.690 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.690 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.691 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.691 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.691 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.691 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.691 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.691 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.691 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.692 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.692 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.692 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.692 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.692 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.692 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.692 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.693 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.693 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.693 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.693 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.693 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.693 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.694 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.694 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.694 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.694 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.694 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.694 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.694 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.695 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.695 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.695 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.695 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.695 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.695 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.695 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.696 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.696 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.696 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.696 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.696 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.696 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.696 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.697 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.697 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.697 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.697 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.697 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.697 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.698 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.698 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.698 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.698 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.698 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.698 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.699 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.699 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.699 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.699 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.699 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.699 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.699 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.700 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.700 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.700 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.700 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.700 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.700 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.701 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.701 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.701 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.701 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.701 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.701 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.701 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.702 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.702 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.702 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.702 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.702 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.702 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.702 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.703 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.703 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.703 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.703 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.703 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.703 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.703 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.704 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.704 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.704 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.704 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.704 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.704 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.704 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.705 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.705 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.705 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.705 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.705 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.705 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.705 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.706 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.706 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.706 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.706 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.706 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.706 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.706 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.707 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.707 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.707 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.707 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.707 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.707 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.708 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.708 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.708 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.708 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.708 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.708 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.709 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.709 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.709 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.709 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.709 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.709 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.710 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.710 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.710 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.710 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.710 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.710 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.710 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.711 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.711 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.711 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.711 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.711 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.711 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.712 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.712 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.712 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.712 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.712 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.712 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.713 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.713 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.713 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.713 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.713 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.713 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.714 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.714 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.714 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.714 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.714 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.714 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.714 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.715 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.715 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.715 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.715 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.715 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.716 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.716 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.716 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.716 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.716 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.717 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.717 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.717 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.717 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.717 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.717 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.718 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.718 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.718 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.718 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.718 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.718 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.719 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.719 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.719 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.719 250710 WARNING oslo_config.cfg [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec  1 04:32:27 np0005540741 nova_compute[250706]: live_migration_uri is deprecated for removal in favor of two other options that
Dec  1 04:32:27 np0005540741 nova_compute[250706]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec  1 04:32:27 np0005540741 nova_compute[250706]: and ``live_migration_inbound_addr`` respectively.
Dec  1 04:32:27 np0005540741 nova_compute[250706]: ).  Its value may be silently ignored in the future.#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.719 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.720 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.720 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.720 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.720 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.720 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.720 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.721 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.721 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.721 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.721 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.721 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.721 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.722 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.722 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.722 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.722 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.722 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.723 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rbd_secret_uuid        = 5620a9fb-e540-5250-a0e8-7aaad5347e3b log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.723 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.723 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.723 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.723 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.723 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.723 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.724 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.724 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.724 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.724 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.724 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.724 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.725 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.725 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.725 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.725 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.725 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.726 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.726 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.726 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.726 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.726 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.727 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.727 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.727 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.727 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.727 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.728 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.728 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.728 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.728 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.728 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.728 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.729 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.729 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.729 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.729 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.729 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.729 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.730 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.730 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.730 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.730 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.730 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.730 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.731 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.731 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.731 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.731 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.731 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.731 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.731 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.732 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.732 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.732 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.732 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.732 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.732 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.733 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.733 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.733 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.733 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.733 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.733 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.734 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.734 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.734 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.734 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.734 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.734 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.734 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.735 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.735 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.735 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.735 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.735 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.735 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.735 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.736 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.736 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.736 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.736 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.736 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.736 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.736 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.736 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.737 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.737 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.737 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.737 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.737 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.737 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.737 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.738 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.738 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.738 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.738 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.738 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.738 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.738 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.739 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.739 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.739 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.739 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.739 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.739 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.739 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.740 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.740 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.740 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.740 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.740 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.740 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.741 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.741 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.741 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.741 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.741 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.741 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.741 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.742 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.742 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.742 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.742 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.742 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.743 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.743 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.743 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.743 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.743 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.743 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.743 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.744 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.744 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.744 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.744 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.744 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.744 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.745 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.745 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.745 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.745 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.745 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.745 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.746 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.746 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.746 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.746 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.746 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.746 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.747 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.747 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.747 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.747 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.747 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.747 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.747 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.748 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.748 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.748 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.748 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.748 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.748 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.749 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.749 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.749 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.749 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.749 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.749 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.750 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.750 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.750 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.750 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.750 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.750 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.751 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.751 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.751 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.751 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.752 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.752 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.752 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.752 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.752 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.752 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.752 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.753 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.753 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.753 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.753 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.753 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.753 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.753 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.754 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.754 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.754 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.754 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.754 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.754 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.755 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.755 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.755 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.755 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.755 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.755 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.755 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.756 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.756 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.756 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.756 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.756 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.756 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.756 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.757 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.757 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.757 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.757 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.757 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.757 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.758 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.758 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.758 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.758 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.758 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.758 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.759 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.759 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.759 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.759 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.759 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.759 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.760 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.760 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.760 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.760 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.760 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.761 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.761 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.761 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.761 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.761 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.762 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.762 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.762 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.762 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.762 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.763 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.763 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.763 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.763 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.763 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.763 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.764 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.764 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.764 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.764 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.764 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.765 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.765 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.765 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.765 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.766 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.766 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.766 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.766 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.766 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.766 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.767 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.767 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.767 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.767 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.767 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.767 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.768 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.768 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.768 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.768 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.768 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.768 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.768 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.769 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.769 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.769 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.769 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.769 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.769 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.770 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.770 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.770 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.770 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.770 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.770 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.770 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.771 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.771 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.771 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.771 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.771 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.771 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.771 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.772 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.772 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.772 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.772 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.772 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.772 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.772 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.773 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.773 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.773 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.773 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.773 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.773 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.774 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.774 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.774 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.774 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.774 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.774 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.774 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.775 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.775 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.775 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.775 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.775 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.775 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.776 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.776 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.776 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.776 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.776 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.776 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.776 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.777 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.777 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.777 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.777 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.777 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.778 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.778 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.778 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.778 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.778 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.778 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.778 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.779 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.779 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.779 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.779 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.779 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.779 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.779 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.780 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.780 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.780 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.780 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.780 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.781 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.781 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.781 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.781 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.781 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.781 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.781 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.782 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.782 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.782 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.782 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.782 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.782 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.782 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.783 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.783 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.783 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.783 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.783 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.783 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.784 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.784 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.784 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.784 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.784 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.784 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.784 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.785 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.785 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.785 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.785 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.785 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.785 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.786 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.786 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.786 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.786 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.786 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.786 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.787 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.787 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.787 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.787 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.787 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.787 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.788 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.788 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.788 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.789 250710 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.806 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.806 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.806 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.807 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.829 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f5359419580> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.832 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f5359419580> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.833 250710 INFO nova.virt.libvirt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Connection event '1' reason 'None'#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.842 250710 INFO nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Libvirt host capabilities <capabilities>
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <host>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <uuid>52310927-1d30-4bda-9d2b-fd9f7cfadc4d</uuid>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <cpu>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <arch>x86_64</arch>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model>EPYC-Rome-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <vendor>AMD</vendor>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <microcode version='16777317'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <signature family='23' model='49' stepping='0'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <maxphysaddr mode='emulate' bits='40'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='x2apic'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='tsc-deadline'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='osxsave'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='hypervisor'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='tsc_adjust'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='spec-ctrl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='stibp'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='arch-capabilities'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='ssbd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='cmp_legacy'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='topoext'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='virt-ssbd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='lbrv'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='tsc-scale'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='vmcb-clean'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='pause-filter'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='pfthreshold'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='svme-addr-chk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='rdctl-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='skip-l1dfl-vmentry'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='mds-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature name='pschange-mc-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <pages unit='KiB' size='4'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <pages unit='KiB' size='2048'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <pages unit='KiB' size='1048576'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </cpu>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <power_management>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <suspend_mem/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </power_management>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <iommu support='no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <migration_features>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <live/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <uri_transports>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <uri_transport>tcp</uri_transport>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <uri_transport>rdma</uri_transport>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </uri_transports>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </migration_features>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <topology>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <cells num='1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <cell id='0'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:          <memory unit='KiB'>7864320</memory>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:          <pages unit='KiB' size='4'>1966080</pages>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:          <pages unit='KiB' size='2048'>0</pages>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:          <pages unit='KiB' size='1048576'>0</pages>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:          <distances>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:            <sibling id='0' value='10'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:          </distances>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:          <cpus num='8'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:          </cpus>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        </cell>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </cells>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </topology>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <cache>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </cache>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <secmodel>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model>selinux</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <doi>0</doi>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </secmodel>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <secmodel>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model>dac</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <doi>0</doi>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </secmodel>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  </host>
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <guest>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <os_type>hvm</os_type>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <arch name='i686'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <wordsize>32</wordsize>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <domain type='qemu'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <domain type='kvm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </arch>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <features>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <pae/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <nonpae/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <acpi default='on' toggle='yes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <apic default='on' toggle='no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <cpuselection/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <deviceboot/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <disksnapshot default='on' toggle='no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <externalSnapshot/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </features>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  </guest>
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <guest>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <os_type>hvm</os_type>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <arch name='x86_64'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <wordsize>64</wordsize>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <domain type='qemu'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <domain type='kvm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </arch>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <features>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <acpi default='on' toggle='yes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <apic default='on' toggle='no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <cpuselection/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <deviceboot/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <disksnapshot default='on' toggle='no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <externalSnapshot/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </features>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  </guest>
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 
Dec  1 04:32:27 np0005540741 nova_compute[250706]: </capabilities>
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.847 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.848 250710 WARNING nova.virt.libvirt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.849 250710 DEBUG nova.virt.libvirt.volume.mount [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.852 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec  1 04:32:27 np0005540741 nova_compute[250706]: <domainCapabilities>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <domain>kvm</domain>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <arch>i686</arch>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <vcpu max='4096'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <iothreads supported='yes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <os supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <enum name='firmware'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <loader supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>rom</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>pflash</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='readonly'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>yes</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>no</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='secure'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>no</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </loader>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  </os>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <cpu>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <mode name='host-passthrough' supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='hostPassthroughMigratable'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>on</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>off</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </mode>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <mode name='maximum' supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='maximumMigratable'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>on</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>off</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </mode>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <mode name='host-model' supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <vendor>AMD</vendor>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='x2apic'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='hypervisor'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='stibp'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='ssbd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='overflow-recov'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='succor'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='ibrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='lbrv'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='tsc-scale'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='flushbyasid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='pause-filter'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='pfthreshold'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='disable' name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </mode>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <mode name='custom' supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-noTSX'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cooperlake'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cooperlake-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cooperlake-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Denverton'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mpx'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Denverton-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mpx'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Denverton-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Denverton-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Dhyana-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Genoa'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amd-psfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='auto-ibrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='stibp-always-on'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amd-psfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='auto-ibrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='stibp-always-on'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Milan'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Milan-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Milan-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amd-psfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='stibp-always-on'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Rome'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Rome-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Rome-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Rome-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='GraniteRapids'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='prefetchiti'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='GraniteRapids-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='prefetchiti'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='GraniteRapids-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx10'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx10-128'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx10-256'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx10-512'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='prefetchiti'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Haswell'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Haswell-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Haswell-noTSX'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Haswell-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Haswell-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Haswell-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Haswell-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v5'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v6'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v7'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='IvyBridge'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='IvyBridge-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='IvyBridge-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='IvyBridge-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='KnightsMill'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-4fmaps'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-4vnniw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512er'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512pf'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='KnightsMill-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-4fmaps'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-4vnniw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512er'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512pf'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Opteron_G4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fma4'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xop'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Opteron_G4-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fma4'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xop'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Opteron_G5'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fma4'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tbm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xop'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Opteron_G5-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fma4'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tbm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xop'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='SapphireRapids'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='SapphireRapids-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='SapphireRapids-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='SapphireRapids-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='SierraForest'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-ne-convert'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cmpccxadd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='SierraForest-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-ne-convert'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cmpccxadd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v5'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Snowridge'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='core-capability'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mpx'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='split-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Snowridge-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='core-capability'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mpx'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='split-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Snowridge-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='core-capability'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='split-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Snowridge-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='core-capability'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='split-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Snowridge-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='athlon'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='3dnow'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='3dnowext'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='athlon-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='3dnow'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='3dnowext'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='core2duo'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='core2duo-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='coreduo'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='coreduo-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='n270'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='n270-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='phenom'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='3dnow'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='3dnowext'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='phenom-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='3dnow'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='3dnowext'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </mode>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  </cpu>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <memoryBacking supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <enum name='sourceType'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <value>file</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <value>anonymous</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <value>memfd</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  </memoryBacking>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <devices>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <disk supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='diskDevice'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>disk</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>cdrom</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>floppy</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>lun</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='bus'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>fdc</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>scsi</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtio</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>usb</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>sata</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='model'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtio</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtio-transitional</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtio-non-transitional</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </disk>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <graphics supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>vnc</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>egl-headless</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>dbus</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </graphics>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <video supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='modelType'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>vga</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>cirrus</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtio</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>none</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>bochs</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>ramfb</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </video>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <hostdev supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='mode'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>subsystem</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='startupPolicy'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>default</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>mandatory</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>requisite</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>optional</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='subsysType'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>usb</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>pci</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>scsi</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='capsType'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='pciBackend'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </hostdev>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <rng supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='model'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtio</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtio-transitional</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtio-non-transitional</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='backendModel'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>random</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>egd</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>builtin</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </rng>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <filesystem supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='driverType'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>path</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>handle</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtiofs</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </filesystem>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <tpm supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='model'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>tpm-tis</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>tpm-crb</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='backendModel'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>emulator</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>external</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='backendVersion'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>2.0</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </tpm>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <redirdev supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='bus'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>usb</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </redirdev>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <channel supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>pty</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>unix</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </channel>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <crypto supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='model'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>qemu</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='backendModel'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>builtin</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </crypto>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <interface supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='backendType'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>default</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>passt</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </interface>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <panic supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='model'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>isa</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>hyperv</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </panic>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <console supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>null</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>vc</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>pty</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>dev</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>file</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>pipe</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>stdio</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>udp</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>tcp</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>unix</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>qemu-vdagent</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>dbus</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </console>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  </devices>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <features>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <gic supported='no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <vmcoreinfo supported='yes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <genid supported='yes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <backingStoreInput supported='yes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <backup supported='yes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <async-teardown supported='yes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <ps2 supported='yes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <sev supported='no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <sgx supported='no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <hyperv supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='features'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>relaxed</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>vapic</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>spinlocks</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>vpindex</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>runtime</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>synic</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>stimer</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>reset</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>vendor_id</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>frequencies</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>reenlightenment</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>tlbflush</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>ipi</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>avic</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>emsr_bitmap</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>xmm_input</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <defaults>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <spinlocks>4095</spinlocks>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <stimer_direct>on</stimer_direct>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </defaults>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </hyperv>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <launchSecurity supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='sectype'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>tdx</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </launchSecurity>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  </features>
Dec  1 04:32:27 np0005540741 nova_compute[250706]: </domainCapabilities>
Dec  1 04:32:27 np0005540741 nova_compute[250706]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.858 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec  1 04:32:27 np0005540741 nova_compute[250706]: <domainCapabilities>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <domain>kvm</domain>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <arch>i686</arch>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <vcpu max='240'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <iothreads supported='yes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <os supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <enum name='firmware'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <loader supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>rom</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>pflash</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='readonly'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>yes</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>no</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='secure'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>no</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </loader>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  </os>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <cpu>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <mode name='host-passthrough' supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='hostPassthroughMigratable'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>on</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>off</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </mode>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <mode name='maximum' supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='maximumMigratable'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>on</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>off</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </mode>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <mode name='host-model' supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <vendor>AMD</vendor>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='x2apic'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='hypervisor'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='stibp'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='ssbd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='overflow-recov'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='succor'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='ibrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='lbrv'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='tsc-scale'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='flushbyasid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='pause-filter'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='pfthreshold'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='disable' name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </mode>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <mode name='custom' supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-noTSX'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cooperlake'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cooperlake-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cooperlake-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Denverton'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mpx'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Denverton-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mpx'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Denverton-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Denverton-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Dhyana-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Genoa'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amd-psfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='auto-ibrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='stibp-always-on'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amd-psfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='auto-ibrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='stibp-always-on'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Milan'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Milan-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Milan-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amd-psfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='stibp-always-on'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Rome'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Rome-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Rome-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Rome-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='GraniteRapids'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='prefetchiti'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='GraniteRapids-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='prefetchiti'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='GraniteRapids-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx10'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx10-128'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx10-256'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx10-512'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='prefetchiti'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Haswell'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Haswell-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Haswell-noTSX'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Haswell-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Haswell-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Haswell-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Haswell-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v5'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v6'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v7'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='IvyBridge'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='IvyBridge-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='IvyBridge-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='IvyBridge-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='KnightsMill'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-4fmaps'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-4vnniw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512er'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512pf'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='KnightsMill-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-4fmaps'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-4vnniw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512er'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512pf'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Opteron_G4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fma4'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xop'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Opteron_G4-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fma4'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xop'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Opteron_G5'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fma4'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tbm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xop'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Opteron_G5-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fma4'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tbm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xop'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='SapphireRapids'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='SapphireRapids-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='SapphireRapids-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='SapphireRapids-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='SierraForest'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-ne-convert'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cmpccxadd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='SierraForest-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-ne-convert'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx-vnni-int8'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cmpccxadd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v5'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Snowridge'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='core-capability'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mpx'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='split-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Snowridge-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='core-capability'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mpx'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='split-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Snowridge-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='core-capability'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='split-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Snowridge-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='core-capability'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='split-lock-detect'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Snowridge-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='athlon'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='3dnow'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='3dnowext'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='athlon-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='3dnow'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='3dnowext'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='core2duo'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='core2duo-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='coreduo'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='coreduo-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='n270'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='n270-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='phenom'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='3dnow'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='3dnowext'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='phenom-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='3dnow'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='3dnowext'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </mode>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  </cpu>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <memoryBacking supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <enum name='sourceType'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <value>file</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <value>anonymous</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <value>memfd</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  </memoryBacking>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <devices>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <disk supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='diskDevice'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>disk</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>cdrom</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>floppy</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>lun</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='bus'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>ide</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>fdc</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>scsi</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtio</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>usb</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>sata</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='model'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtio</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtio-transitional</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtio-non-transitional</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </disk>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <graphics supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>vnc</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>egl-headless</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>dbus</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </graphics>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <video supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='modelType'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>vga</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>cirrus</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtio</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>none</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>bochs</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>ramfb</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </video>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <hostdev supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='mode'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>subsystem</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='startupPolicy'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>default</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>mandatory</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>requisite</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>optional</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='subsysType'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>usb</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>pci</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>scsi</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='capsType'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='pciBackend'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </hostdev>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <rng supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='model'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtio</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtio-transitional</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtio-non-transitional</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='backendModel'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>random</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>egd</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>builtin</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </rng>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <filesystem supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='driverType'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>path</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>handle</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>virtiofs</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </filesystem>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <tpm supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='model'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>tpm-tis</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>tpm-crb</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='backendModel'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>emulator</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>external</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='backendVersion'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>2.0</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </tpm>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <redirdev supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='bus'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>usb</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </redirdev>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <channel supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>pty</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>unix</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </channel>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <crypto supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='model'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>qemu</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='backendModel'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>builtin</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </crypto>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <interface supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='backendType'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>default</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>passt</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </interface>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <panic supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='model'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>isa</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>hyperv</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </panic>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <console supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>null</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>vc</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>pty</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>dev</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>file</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>pipe</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>stdio</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>udp</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>tcp</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>unix</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>qemu-vdagent</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>dbus</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </console>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  </devices>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <features>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <gic supported='no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <vmcoreinfo supported='yes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <genid supported='yes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <backingStoreInput supported='yes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <backup supported='yes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <async-teardown supported='yes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <ps2 supported='yes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <sev supported='no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <sgx supported='no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <hyperv supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='features'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>relaxed</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>vapic</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>spinlocks</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>vpindex</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>runtime</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>synic</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>stimer</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>reset</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>vendor_id</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>frequencies</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>reenlightenment</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>tlbflush</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>ipi</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>avic</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>emsr_bitmap</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>xmm_input</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <defaults>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <spinlocks>4095</spinlocks>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <stimer_direct>on</stimer_direct>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </defaults>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </hyperv>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <launchSecurity supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='sectype'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>tdx</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </launchSecurity>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  </features>
Dec  1 04:32:27 np0005540741 nova_compute[250706]: </domainCapabilities>
Dec  1 04:32:27 np0005540741 nova_compute[250706]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.902 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec  1 04:32:27 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.907 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec  1 04:32:27 np0005540741 nova_compute[250706]: <domainCapabilities>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <domain>kvm</domain>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <arch>x86_64</arch>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <vcpu max='4096'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <iothreads supported='yes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <os supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <enum name='firmware'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <value>efi</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <loader supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>rom</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>pflash</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='readonly'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>yes</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>no</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='secure'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>yes</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>no</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </loader>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  </os>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:  <cpu>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <mode name='host-passthrough' supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='hostPassthroughMigratable'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>on</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>off</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </mode>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <mode name='maximum' supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <enum name='maximumMigratable'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>on</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <value>off</value>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </mode>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <mode name='host-model' supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <vendor>AMD</vendor>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='x2apic'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='hypervisor'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='stibp'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='ssbd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='overflow-recov'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='succor'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='ibrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='lbrv'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='tsc-scale'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='flushbyasid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='pause-filter'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='pfthreshold'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <feature policy='disable' name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    </mode>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:    <mode name='custom' supported='yes'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-noTSX'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cooperlake'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cooperlake-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Cooperlake-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Denverton'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mpx'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Denverton-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='mpx'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Denverton-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Denverton-v3'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='Dhyana-v2'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Genoa'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amd-psfd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='auto-ibrs'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='stibp-always-on'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 04:32:27 np0005540741 nova_compute[250706]:        <feature name='amd-psfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='auto-ibrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='stibp-always-on'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Milan'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Milan-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Milan-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amd-psfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='stibp-always-on'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Rome'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Rome-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Rome-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Rome-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-v4'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='GraniteRapids'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='prefetchiti'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='GraniteRapids-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='prefetchiti'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='GraniteRapids-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx10'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx10-128'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx10-256'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx10-512'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='prefetchiti'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Haswell'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Haswell-IBRS'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Haswell-noTSX'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Haswell-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Haswell-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Haswell-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Haswell-v4'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v4'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v5'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v6'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v7'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='IvyBridge'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='IvyBridge-IBRS'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='IvyBridge-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='IvyBridge-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='KnightsMill'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-4fmaps'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-4vnniw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512er'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512pf'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='KnightsMill-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-4fmaps'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-4vnniw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512er'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512pf'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Opteron_G4'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fma4'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xop'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Opteron_G4-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fma4'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xop'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Opteron_G5'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fma4'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tbm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xop'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Opteron_G5-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fma4'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tbm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xop'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='SapphireRapids'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='SapphireRapids-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='SapphireRapids-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='SapphireRapids-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='SierraForest'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-ne-convert'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cmpccxadd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='SierraForest-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-ne-convert'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cmpccxadd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-v4'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v4'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v5'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Snowridge'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='core-capability'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='mpx'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='split-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Snowridge-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='core-capability'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='mpx'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='split-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Snowridge-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='core-capability'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='split-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Snowridge-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='core-capability'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='split-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Snowridge-v4'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='athlon'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='3dnow'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='3dnowext'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='athlon-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='3dnow'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='3dnowext'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='core2duo'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='core2duo-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='coreduo'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='coreduo-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='n270'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='n270-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='phenom'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='3dnow'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='3dnowext'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='phenom-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='3dnow'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='3dnowext'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </mode>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  </cpu>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  <memoryBacking supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <enum name='sourceType'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <value>file</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <value>anonymous</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <value>memfd</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  </memoryBacking>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  <devices>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <disk supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='diskDevice'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>disk</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>cdrom</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>floppy</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>lun</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='bus'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>fdc</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>scsi</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtio</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>usb</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>sata</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='model'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtio</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtio-transitional</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtio-non-transitional</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </disk>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <graphics supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>vnc</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>egl-headless</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>dbus</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </graphics>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <video supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='modelType'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>vga</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>cirrus</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtio</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>none</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>bochs</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>ramfb</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </video>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <hostdev supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='mode'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>subsystem</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='startupPolicy'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>default</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>mandatory</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>requisite</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>optional</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='subsysType'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>usb</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>pci</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>scsi</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='capsType'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='pciBackend'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </hostdev>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <rng supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='model'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtio</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtio-transitional</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtio-non-transitional</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='backendModel'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>random</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>egd</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>builtin</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </rng>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <filesystem supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='driverType'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>path</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>handle</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtiofs</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </filesystem>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <tpm supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='model'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>tpm-tis</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>tpm-crb</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='backendModel'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>emulator</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>external</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='backendVersion'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>2.0</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </tpm>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <redirdev supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='bus'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>usb</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </redirdev>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <channel supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>pty</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>unix</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </channel>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <crypto supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='model'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>qemu</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='backendModel'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>builtin</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </crypto>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <interface supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='backendType'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>default</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>passt</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </interface>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <panic supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='model'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>isa</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>hyperv</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </panic>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <console supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>null</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>vc</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>pty</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>dev</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>file</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>pipe</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>stdio</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>udp</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>tcp</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>unix</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>qemu-vdagent</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>dbus</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </console>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  </devices>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  <features>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <gic supported='no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <vmcoreinfo supported='yes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <genid supported='yes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <backingStoreInput supported='yes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <backup supported='yes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <async-teardown supported='yes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <ps2 supported='yes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <sev supported='no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <sgx supported='no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <hyperv supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='features'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>relaxed</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>vapic</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>spinlocks</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>vpindex</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>runtime</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>synic</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>stimer</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>reset</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>vendor_id</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>frequencies</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>reenlightenment</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>tlbflush</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>ipi</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>avic</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>emsr_bitmap</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>xmm_input</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <defaults>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <spinlocks>4095</spinlocks>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <stimer_direct>on</stimer_direct>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </defaults>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </hyperv>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <launchSecurity supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='sectype'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>tdx</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </launchSecurity>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  </features>
Dec  1 04:32:28 np0005540741 nova_compute[250706]: </domainCapabilities>
Dec  1 04:32:28 np0005540741 nova_compute[250706]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  1 04:32:28 np0005540741 nova_compute[250706]: 2025-12-01 09:32:27.963 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec  1 04:32:28 np0005540741 nova_compute[250706]: <domainCapabilities>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  <domain>kvm</domain>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  <arch>x86_64</arch>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  <vcpu max='240'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  <iothreads supported='yes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  <os supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <enum name='firmware'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <loader supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>rom</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>pflash</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='readonly'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>yes</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>no</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='secure'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>no</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </loader>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  </os>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  <cpu>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <mode name='host-passthrough' supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='hostPassthroughMigratable'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>on</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>off</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </mode>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <mode name='maximum' supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='maximumMigratable'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>on</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>off</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </mode>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <mode name='host-model' supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <vendor>AMD</vendor>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='x2apic'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='hypervisor'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='stibp'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='ssbd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='overflow-recov'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='succor'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='ibrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='lbrv'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='tsc-scale'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='flushbyasid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='pause-filter'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='pfthreshold'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <feature policy='disable' name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </mode>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <mode name='custom' supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Broadwell'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-IBRS'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-noTSX'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Broadwell-v4'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Cooperlake'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Cooperlake-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Cooperlake-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Denverton'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='mpx'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Denverton-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='mpx'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Denverton-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Denverton-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Dhyana-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Genoa'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amd-psfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='auto-ibrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='stibp-always-on'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amd-psfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='auto-ibrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='stibp-always-on'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Milan'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Milan-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Milan-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amd-psfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='no-nested-data-bp'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='null-sel-clr-base'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='stibp-always-on'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Rome'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Rome-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Rome-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-Rome-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='EPYC-v4'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='GraniteRapids'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='prefetchiti'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='GraniteRapids-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='prefetchiti'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='GraniteRapids-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx10'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx10-128'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx10-256'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx10-512'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='prefetchiti'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Haswell'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Haswell-IBRS'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Haswell-noTSX'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Haswell-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Haswell-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Haswell-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Haswell-v4'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v4'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v5'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v6'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Icelake-Server-v7'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='IvyBridge'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='IvyBridge-IBRS'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='IvyBridge-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='IvyBridge-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='KnightsMill'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-4fmaps'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-4vnniw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512er'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512pf'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='KnightsMill-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-4fmaps'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-4vnniw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512er'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512pf'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Opteron_G4'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fma4'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xop'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Opteron_G4-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fma4'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xop'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Opteron_G5'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fma4'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tbm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xop'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Opteron_G5-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fma4'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tbm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xop'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='SapphireRapids'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='SapphireRapids-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='SapphireRapids-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='SapphireRapids-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='amx-tile'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-bf16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-fp16'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512-vpopcntdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bitalg'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vbmi2'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrc'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fzrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='la57'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='taa-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='tsx-ldtrk'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xfd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='SierraForest'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-ne-convert'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cmpccxadd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='SierraForest-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-ifma'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-ne-convert'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx-vnni-int8'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='bus-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cmpccxadd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fbsdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='fsrs'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ibrs-all'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='mcdt-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pbrsb-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='psdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='sbdr-ssdp-no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='serialize'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vaes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='vpclmulqdq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Client-v4'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='hle'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='rtm'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v4'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Skylake-Server-v5'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512bw'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512cd'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512dq'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512f'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='avx512vl'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='invpcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pcid'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='pku'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Snowridge'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='core-capability'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='mpx'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='split-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Snowridge-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='core-capability'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='mpx'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='split-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Snowridge-v2'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='core-capability'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='split-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Snowridge-v3'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='core-capability'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='split-lock-detect'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='Snowridge-v4'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='cldemote'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='erms'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='gfni'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdir64b'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='movdiri'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='xsaves'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='athlon'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='3dnow'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='3dnowext'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='athlon-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='3dnow'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='3dnowext'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='core2duo'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='core2duo-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='coreduo'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='coreduo-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='n270'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='n270-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='ss'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='phenom'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='3dnow'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='3dnowext'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <blockers model='phenom-v1'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='3dnow'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <feature name='3dnowext'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </blockers>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </mode>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  </cpu>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  <memoryBacking supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <enum name='sourceType'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <value>file</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <value>anonymous</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <value>memfd</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  </memoryBacking>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  <devices>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <disk supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='diskDevice'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>disk</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>cdrom</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>floppy</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>lun</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='bus'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>ide</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>fdc</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>scsi</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtio</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>usb</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>sata</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='model'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtio</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtio-transitional</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtio-non-transitional</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </disk>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <graphics supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>vnc</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>egl-headless</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>dbus</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </graphics>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <video supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='modelType'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>vga</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>cirrus</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtio</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>none</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>bochs</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>ramfb</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </video>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <hostdev supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='mode'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>subsystem</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='startupPolicy'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>default</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>mandatory</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>requisite</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>optional</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='subsysType'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>usb</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>pci</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>scsi</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='capsType'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='pciBackend'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </hostdev>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <rng supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='model'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtio</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtio-transitional</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtio-non-transitional</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='backendModel'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>random</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>egd</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>builtin</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </rng>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <filesystem supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='driverType'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>path</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>handle</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>virtiofs</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </filesystem>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <tpm supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='model'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>tpm-tis</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>tpm-crb</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='backendModel'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>emulator</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>external</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='backendVersion'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>2.0</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </tpm>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <redirdev supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='bus'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>usb</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </redirdev>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <channel supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>pty</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>unix</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </channel>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <crypto supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='model'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>qemu</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='backendModel'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>builtin</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </crypto>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <interface supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='backendType'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>default</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>passt</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </interface>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <panic supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='model'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>isa</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>hyperv</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </panic>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <console supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='type'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>null</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>vc</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>pty</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>dev</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>file</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>pipe</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>stdio</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>udp</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>tcp</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>unix</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>qemu-vdagent</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>dbus</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </console>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  </devices>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  <features>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <gic supported='no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <vmcoreinfo supported='yes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <genid supported='yes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <backingStoreInput supported='yes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <backup supported='yes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <async-teardown supported='yes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <ps2 supported='yes'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <sev supported='no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <sgx supported='no'/>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <hyperv supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='features'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>relaxed</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>vapic</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>spinlocks</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>vpindex</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>runtime</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>synic</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>stimer</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>reset</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>vendor_id</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>frequencies</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>reenlightenment</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>tlbflush</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>ipi</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>avic</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>emsr_bitmap</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>xmm_input</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <defaults>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <spinlocks>4095</spinlocks>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <stimer_direct>on</stimer_direct>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </defaults>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </hyperv>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    <launchSecurity supported='yes'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      <enum name='sectype'>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:        <value>tdx</value>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:      </enum>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:    </launchSecurity>
Dec  1 04:32:28 np0005540741 nova_compute[250706]:  </features>
Dec  1 04:32:28 np0005540741 nova_compute[250706]: </domainCapabilities>
Dec  1 04:32:28 np0005540741 nova_compute[250706]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  1 04:32:28 np0005540741 nova_compute[250706]: 2025-12-01 09:32:28.023 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  1 04:32:28 np0005540741 nova_compute[250706]: 2025-12-01 09:32:28.023 250710 INFO nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Secure Boot support detected#033[00m
Dec  1 04:32:28 np0005540741 nova_compute[250706]: 2025-12-01 09:32:28.027 250710 INFO nova.virt.libvirt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  1 04:32:28 np0005540741 nova_compute[250706]: 2025-12-01 09:32:28.028 250710 INFO nova.virt.libvirt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  1 04:32:28 np0005540741 nova_compute[250706]: 2025-12-01 09:32:28.041 250710 DEBUG nova.virt.libvirt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Dec  1 04:32:28 np0005540741 nova_compute[250706]: 2025-12-01 09:32:28.086 250710 INFO nova.virt.node [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Determined node identity 847e3dbe-0f76-4032-a374-8c965945c22f from /var/lib/nova/compute_id#033[00m
Dec  1 04:32:28 np0005540741 nova_compute[250706]: 2025-12-01 09:32:28.120 250710 WARNING nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Compute nodes ['847e3dbe-0f76-4032-a374-8c965945c22f'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Dec  1 04:32:28 np0005540741 nova_compute[250706]: 2025-12-01 09:32:28.169 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Dec  1 04:32:28 np0005540741 nova_compute[250706]: 2025-12-01 09:32:28.213 250710 WARNING nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.#033[00m
Dec  1 04:32:28 np0005540741 nova_compute[250706]: 2025-12-01 09:32:28.214 250710 DEBUG oslo_concurrency.lockutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:32:28 np0005540741 nova_compute[250706]: 2025-12-01 09:32:28.215 250710 DEBUG oslo_concurrency.lockutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:32:28 np0005540741 nova_compute[250706]: 2025-12-01 09:32:28.215 250710 DEBUG oslo_concurrency.lockutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:32:28 np0005540741 nova_compute[250706]: 2025-12-01 09:32:28.215 250710 DEBUG nova.compute.resource_tracker [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 04:32:28 np0005540741 nova_compute[250706]: 2025-12-01 09:32:28.216 250710 DEBUG oslo_concurrency.processutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:32:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec  1 04:32:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2759139554' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 04:32:28 np0005540741 nova_compute[250706]: 2025-12-01 09:32:28.688 250710 DEBUG oslo_concurrency.processutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:32:28 np0005540741 systemd[1]: Starting libvirt nodedev daemon...
Dec  1 04:32:28 np0005540741 systemd[1]: Started libvirt nodedev daemon.
Dec  1 04:32:29 np0005540741 nova_compute[250706]: 2025-12-01 09:32:29.059 250710 WARNING nova.virt.libvirt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 04:32:29 np0005540741 nova_compute[250706]: 2025-12-01 09:32:29.060 250710 DEBUG nova.compute.resource_tracker [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5314MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 04:32:29 np0005540741 nova_compute[250706]: 2025-12-01 09:32:29.061 250710 DEBUG oslo_concurrency.lockutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  1 04:32:29 np0005540741 nova_compute[250706]: 2025-12-01 09:32:29.061 250710 DEBUG oslo_concurrency.lockutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  1 04:32:29 np0005540741 nova_compute[250706]: 2025-12-01 09:32:29.085 250710 WARNING nova.compute.resource_tracker [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] No compute node record for compute-0.ctlplane.example.com:847e3dbe-0f76-4032-a374-8c965945c22f: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 847e3dbe-0f76-4032-a374-8c965945c22f could not be found.
Dec  1 04:32:29 np0005540741 nova_compute[250706]: 2025-12-01 09:32:29.120 250710 INFO nova.compute.resource_tracker [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 847e3dbe-0f76-4032-a374-8c965945c22f
Dec  1 04:32:29 np0005540741 nova_compute[250706]: 2025-12-01 09:32:29.220 250710 DEBUG nova.compute.resource_tracker [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  1 04:32:29 np0005540741 nova_compute[250706]: 2025-12-01 09:32:29.220 250710 DEBUG nova.compute.resource_tracker [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  1 04:32:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v616: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:32:30 np0005540741 nova_compute[250706]: 2025-12-01 09:32:30.579 250710 INFO nova.scheduler.client.report [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [req-0f32e5bd-c8a2-45dd-8409-cb45e76985ee] Created resource provider record via placement API for resource provider with UUID 847e3dbe-0f76-4032-a374-8c965945c22f and name compute-0.ctlplane.example.com.
Dec  1 04:32:31 np0005540741 nova_compute[250706]: 2025-12-01 09:32:31.019 250710 DEBUG oslo_concurrency.processutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  1 04:32:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec  1 04:32:31 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1601777968' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 04:32:31 np0005540741 nova_compute[250706]: 2025-12-01 09:32:31.433 250710 DEBUG oslo_concurrency.processutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  1 04:32:31 np0005540741 nova_compute[250706]: 2025-12-01 09:32:31.441 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec  1 04:32:31 np0005540741 nova_compute[250706]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec  1 04:32:31 np0005540741 nova_compute[250706]: 2025-12-01 09:32:31.442 250710 INFO nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] kernel doesn't support AMD SEV
Dec  1 04:32:31 np0005540741 nova_compute[250706]: 2025-12-01 09:32:31.444 250710 DEBUG nova.compute.provider_tree [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Updating inventory in ProviderTree for provider 847e3dbe-0f76-4032-a374-8c965945c22f with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec  1 04:32:31 np0005540741 nova_compute[250706]: 2025-12-01 09:32:31.445 250710 DEBUG nova.virt.libvirt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec  1 04:32:31 np0005540741 nova_compute[250706]: 2025-12-01 09:32:31.515 250710 DEBUG nova.scheduler.client.report [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Updated inventory for provider 847e3dbe-0f76-4032-a374-8c965945c22f with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec  1 04:32:31 np0005540741 nova_compute[250706]: 2025-12-01 09:32:31.515 250710 DEBUG nova.compute.provider_tree [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Updating resource provider 847e3dbe-0f76-4032-a374-8c965945c22f generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec  1 04:32:31 np0005540741 nova_compute[250706]: 2025-12-01 09:32:31.516 250710 DEBUG nova.compute.provider_tree [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Updating inventory in ProviderTree for provider 847e3dbe-0f76-4032-a374-8c965945c22f with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec  1 04:32:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v617: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:31 np0005540741 nova_compute[250706]: 2025-12-01 09:32:31.660 250710 DEBUG nova.compute.provider_tree [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Updating resource provider 847e3dbe-0f76-4032-a374-8c965945c22f generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec  1 04:32:31 np0005540741 nova_compute[250706]: 2025-12-01 09:32:31.697 250710 DEBUG nova.compute.resource_tracker [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  1 04:32:31 np0005540741 nova_compute[250706]: 2025-12-01 09:32:31.698 250710 DEBUG oslo_concurrency.lockutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  1 04:32:31 np0005540741 nova_compute[250706]: 2025-12-01 09:32:31.698 250710 DEBUG nova.service [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec  1 04:32:31 np0005540741 nova_compute[250706]: 2025-12-01 09:32:31.861 250710 DEBUG nova.service [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec  1 04:32:31 np0005540741 nova_compute[250706]: 2025-12-01 09:32:31.862 250710 DEBUG nova.servicegroup.drivers.db [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec  1 04:32:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v618: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:35 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:32:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v619: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v620: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v621: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:32:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:32:40 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:32:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:32:40 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:32:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:32:40 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:32:40 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 16a85b5a-5546-429e-9698-a440e28dfd21 does not exist
Dec  1 04:32:40 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 82b4de4d-df1d-481e-8d7b-873fa8be56f6 does not exist
Dec  1 04:32:40 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 93e6d012-ce34-4722-b533-47eee0438b67 does not exist
Dec  1 04:32:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:32:40 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:32:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:32:40 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:32:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:32:40 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:32:41 np0005540741 podman[251364]: 2025-12-01 09:32:41.504042615 +0000 UTC m=+0.048677612 container create b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_galileo, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec  1 04:32:41 np0005540741 systemd[1]: Started libpod-conmon-b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df.scope.
Dec  1 04:32:41 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:32:41 np0005540741 podman[251364]: 2025-12-01 09:32:41.478035677 +0000 UTC m=+0.022670734 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:32:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v622: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:41 np0005540741 podman[251364]: 2025-12-01 09:32:41.655413021 +0000 UTC m=+0.200048018 container init b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_galileo, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef)
Dec  1 04:32:41 np0005540741 podman[251364]: 2025-12-01 09:32:41.662856095 +0000 UTC m=+0.207491102 container start b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_galileo, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec  1 04:32:41 np0005540741 elastic_galileo[251380]: 167 167
Dec  1 04:32:41 np0005540741 systemd[1]: libpod-b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df.scope: Deactivated successfully.
Dec  1 04:32:41 np0005540741 podman[251364]: 2025-12-01 09:32:41.719471724 +0000 UTC m=+0.264106771 container attach b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_galileo, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2)
Dec  1 04:32:41 np0005540741 podman[251364]: 2025-12-01 09:32:41.720570386 +0000 UTC m=+0.265205393 container died b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_galileo, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:32:41 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:32:41 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:32:41 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:32:41 np0005540741 systemd[1]: var-lib-containers-storage-overlay-6eb702f5ef26153352c77e49a46a32f0a382234047422cda2793bd06417e00e3-merged.mount: Deactivated successfully.
Dec  1 04:32:41 np0005540741 podman[251364]: 2025-12-01 09:32:41.824589129 +0000 UTC m=+0.369224116 container remove b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_galileo, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  1 04:32:41 np0005540741 systemd[1]: libpod-conmon-b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df.scope: Deactivated successfully.
Dec  1 04:32:42 np0005540741 podman[251406]: 2025-12-01 09:32:42.020071314 +0000 UTC m=+0.046750486 container create 31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_lederberg, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec  1 04:32:42 np0005540741 systemd[1]: Started libpod-conmon-31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1.scope.
Dec  1 04:32:42 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:32:42 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db5e5010db24d6ad744a9dfbe556db4b62302db2ef5d62129a33f418899eecfd/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:42 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db5e5010db24d6ad744a9dfbe556db4b62302db2ef5d62129a33f418899eecfd/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:42 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db5e5010db24d6ad744a9dfbe556db4b62302db2ef5d62129a33f418899eecfd/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:42 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db5e5010db24d6ad744a9dfbe556db4b62302db2ef5d62129a33f418899eecfd/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:42 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db5e5010db24d6ad744a9dfbe556db4b62302db2ef5d62129a33f418899eecfd/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:42 np0005540741 podman[251406]: 2025-12-01 09:32:42.004086764 +0000 UTC m=+0.030765966 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:32:42 np0005540741 podman[251406]: 2025-12-01 09:32:42.118181447 +0000 UTC m=+0.144860649 container init 31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_lederberg, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:32:42 np0005540741 podman[251406]: 2025-12-01 09:32:42.133997752 +0000 UTC m=+0.160676934 container start 31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_lederberg, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS)
Dec  1 04:32:42 np0005540741 podman[251406]: 2025-12-01 09:32:42.137580945 +0000 UTC m=+0.164260127 container attach 31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_lederberg, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:32:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:32:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:32:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:32:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:32:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:32:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:32:43 np0005540741 stoic_lederberg[251423]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:32:43 np0005540741 stoic_lederberg[251423]: --> relative data size: 1.0
Dec  1 04:32:43 np0005540741 stoic_lederberg[251423]: --> All data devices are unavailable
Dec  1 04:32:43 np0005540741 systemd[1]: libpod-31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1.scope: Deactivated successfully.
Dec  1 04:32:43 np0005540741 podman[251406]: 2025-12-01 09:32:43.125912265 +0000 UTC m=+1.152591447 container died 31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_lederberg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:32:43 np0005540741 systemd[1]: var-lib-containers-storage-overlay-db5e5010db24d6ad744a9dfbe556db4b62302db2ef5d62129a33f418899eecfd-merged.mount: Deactivated successfully.
Dec  1 04:32:43 np0005540741 podman[251406]: 2025-12-01 09:32:43.182467302 +0000 UTC m=+1.209146484 container remove 31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_lederberg, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef)
Dec  1 04:32:43 np0005540741 systemd[1]: libpod-conmon-31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1.scope: Deactivated successfully.
Dec  1 04:32:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v623: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:43 np0005540741 podman[251601]: 2025-12-01 09:32:43.895197071 +0000 UTC m=+0.043947856 container create 49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_margulis, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Dec  1 04:32:43 np0005540741 systemd[1]: Started libpod-conmon-49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1.scope.
Dec  1 04:32:43 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:32:43 np0005540741 podman[251601]: 2025-12-01 09:32:43.876271986 +0000 UTC m=+0.025022771 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:32:43 np0005540741 podman[251601]: 2025-12-01 09:32:43.975036178 +0000 UTC m=+0.123786953 container init 49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_margulis, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:32:43 np0005540741 podman[251601]: 2025-12-01 09:32:43.98065259 +0000 UTC m=+0.129403345 container start 49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_margulis, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec  1 04:32:43 np0005540741 hopeful_margulis[251617]: 167 167
Dec  1 04:32:43 np0005540741 systemd[1]: libpod-49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1.scope: Deactivated successfully.
Dec  1 04:32:43 np0005540741 podman[251601]: 2025-12-01 09:32:43.991346147 +0000 UTC m=+0.140096902 container attach 49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_margulis, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:32:43 np0005540741 podman[251601]: 2025-12-01 09:32:43.992886092 +0000 UTC m=+0.141636907 container died 49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_margulis, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  1 04:32:44 np0005540741 systemd[1]: var-lib-containers-storage-overlay-08f47cab1a6ffa97d8a1054339c770b2d90061e6478b9a4153ac1fd1e9fb4ed6-merged.mount: Deactivated successfully.
Dec  1 04:32:44 np0005540741 podman[251601]: 2025-12-01 09:32:44.030971568 +0000 UTC m=+0.179722333 container remove 49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_margulis, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3)
Dec  1 04:32:44 np0005540741 systemd[1]: libpod-conmon-49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1.scope: Deactivated successfully.
Dec  1 04:32:44 np0005540741 podman[251634]: 2025-12-01 09:32:44.106679316 +0000 UTC m=+0.070185501 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  1 04:32:44 np0005540741 podman[251661]: 2025-12-01 09:32:44.188614394 +0000 UTC m=+0.040010192 container create d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcclintock, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True)
Dec  1 04:32:44 np0005540741 systemd[1]: Started libpod-conmon-d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54.scope.
Dec  1 04:32:44 np0005540741 podman[251661]: 2025-12-01 09:32:44.171833441 +0000 UTC m=+0.023229259 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:32:44 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:32:44 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06f7d588e0174f3d0d3e3e9617576f28b5b894b1ad05ed787b0cd126a800820b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:44 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06f7d588e0174f3d0d3e3e9617576f28b5b894b1ad05ed787b0cd126a800820b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:44 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06f7d588e0174f3d0d3e3e9617576f28b5b894b1ad05ed787b0cd126a800820b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:44 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06f7d588e0174f3d0d3e3e9617576f28b5b894b1ad05ed787b0cd126a800820b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:44 np0005540741 podman[251661]: 2025-12-01 09:32:44.297550339 +0000 UTC m=+0.148946187 container init d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcclintock, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2)
Dec  1 04:32:44 np0005540741 podman[251661]: 2025-12-01 09:32:44.303797558 +0000 UTC m=+0.155193396 container start d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcclintock, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  1 04:32:44 np0005540741 podman[251661]: 2025-12-01 09:32:44.308867024 +0000 UTC m=+0.160262852 container attach d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcclintock, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]: {
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:    "0": [
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:        {
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "devices": [
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "/dev/loop3"
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            ],
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "lv_name": "ceph_lv0",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "lv_size": "21470642176",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "name": "ceph_lv0",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "tags": {
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.cluster_name": "ceph",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.crush_device_class": "",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.encrypted": "0",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.osd_id": "0",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.type": "block",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.vdo": "0"
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            },
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "type": "block",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "vg_name": "ceph_vg0"
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:        }
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:    ],
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:    "1": [
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:        {
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "devices": [
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "/dev/loop4"
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            ],
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "lv_name": "ceph_lv1",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "lv_size": "21470642176",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "name": "ceph_lv1",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "tags": {
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.cluster_name": "ceph",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.crush_device_class": "",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.encrypted": "0",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.osd_id": "1",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.type": "block",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.vdo": "0"
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            },
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "type": "block",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "vg_name": "ceph_vg1"
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:        }
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:    ],
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:    "2": [
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:        {
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "devices": [
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "/dev/loop5"
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            ],
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "lv_name": "ceph_lv2",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "lv_size": "21470642176",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "name": "ceph_lv2",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "tags": {
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.cluster_name": "ceph",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.crush_device_class": "",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.encrypted": "0",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.osd_id": "2",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.type": "block",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:                "ceph.vdo": "0"
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            },
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "type": "block",
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:            "vg_name": "ceph_vg2"
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:        }
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]:    ]
Dec  1 04:32:45 np0005540741 priceless_mcclintock[251678]: }
Dec  1 04:32:45 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:32:45 np0005540741 systemd[1]: libpod-d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54.scope: Deactivated successfully.
Dec  1 04:32:45 np0005540741 podman[251661]: 2025-12-01 09:32:45.129862759 +0000 UTC m=+0.981258557 container died d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcclintock, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:32:45 np0005540741 systemd[1]: var-lib-containers-storage-overlay-06f7d588e0174f3d0d3e3e9617576f28b5b894b1ad05ed787b0cd126a800820b-merged.mount: Deactivated successfully.
Dec  1 04:32:45 np0005540741 podman[251661]: 2025-12-01 09:32:45.180077474 +0000 UTC m=+1.031473272 container remove d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcclintock, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  1 04:32:45 np0005540741 systemd[1]: libpod-conmon-d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54.scope: Deactivated successfully.
Dec  1 04:32:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v624: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:45 np0005540741 podman[251835]: 2025-12-01 09:32:45.762230785 +0000 UTC m=+0.046455647 container create eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_wright, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:32:45 np0005540741 systemd[1]: Started libpod-conmon-eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf.scope.
Dec  1 04:32:45 np0005540741 podman[251835]: 2025-12-01 09:32:45.738087061 +0000 UTC m=+0.022311953 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:32:45 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:32:45 np0005540741 podman[251835]: 2025-12-01 09:32:45.850981509 +0000 UTC m=+0.135206351 container init eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_wright, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:32:45 np0005540741 podman[251835]: 2025-12-01 09:32:45.857556678 +0000 UTC m=+0.141781520 container start eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_wright, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:32:45 np0005540741 podman[251835]: 2025-12-01 09:32:45.860596516 +0000 UTC m=+0.144821378 container attach eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_wright, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:32:45 np0005540741 hardcore_wright[251852]: 167 167
Dec  1 04:32:45 np0005540741 systemd[1]: libpod-eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf.scope: Deactivated successfully.
Dec  1 04:32:45 np0005540741 podman[251835]: 2025-12-01 09:32:45.863997304 +0000 UTC m=+0.148222166 container died eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_wright, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Dec  1 04:32:45 np0005540741 systemd[1]: var-lib-containers-storage-overlay-b9afd6fd5e072bf344966fd77833dabc81109e24757ce05368bd292c92c40138-merged.mount: Deactivated successfully.
Dec  1 04:32:45 np0005540741 podman[251835]: 2025-12-01 09:32:45.9031497 +0000 UTC m=+0.187374572 container remove eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_wright, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:32:45 np0005540741 systemd[1]: libpod-conmon-eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf.scope: Deactivated successfully.
Dec  1 04:32:46 np0005540741 podman[251876]: 2025-12-01 09:32:46.102513057 +0000 UTC m=+0.047384244 container create d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_greider, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:32:46 np0005540741 systemd[1]: Started libpod-conmon-d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400.scope.
Dec  1 04:32:46 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:32:46 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4df32bb0f4da53e35621a2cbc93460ba62c3c08bda2d6c971365066d7bd5cd08/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:46 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4df32bb0f4da53e35621a2cbc93460ba62c3c08bda2d6c971365066d7bd5cd08/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:46 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4df32bb0f4da53e35621a2cbc93460ba62c3c08bda2d6c971365066d7bd5cd08/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:46 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4df32bb0f4da53e35621a2cbc93460ba62c3c08bda2d6c971365066d7bd5cd08/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:32:46 np0005540741 podman[251876]: 2025-12-01 09:32:46.082874862 +0000 UTC m=+0.027746109 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:32:46 np0005540741 podman[251876]: 2025-12-01 09:32:46.192714733 +0000 UTC m=+0.137585960 container init d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_greider, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2)
Dec  1 04:32:46 np0005540741 podman[251876]: 2025-12-01 09:32:46.201782744 +0000 UTC m=+0.146653931 container start d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_greider, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef)
Dec  1 04:32:46 np0005540741 podman[251876]: 2025-12-01 09:32:46.205105069 +0000 UTC m=+0.149976286 container attach d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_greider, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Dec  1 04:32:47 np0005540741 gallant_greider[251892]: {
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:        "osd_id": 0,
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:        "type": "bluestore"
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:    },
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:        "osd_id": 1,
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:        "type": "bluestore"
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:    },
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:        "osd_id": 2,
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:        "type": "bluestore"
Dec  1 04:32:47 np0005540741 gallant_greider[251892]:    }
Dec  1 04:32:47 np0005540741 gallant_greider[251892]: }
Dec  1 04:32:47 np0005540741 systemd[1]: libpod-d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400.scope: Deactivated successfully.
Dec  1 04:32:47 np0005540741 systemd[1]: libpod-d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400.scope: Consumed 1.008s CPU time.
Dec  1 04:32:47 np0005540741 podman[251876]: 2025-12-01 09:32:47.206810103 +0000 UTC m=+1.151681300 container died d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_greider, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec  1 04:32:47 np0005540741 systemd[1]: var-lib-containers-storage-overlay-4df32bb0f4da53e35621a2cbc93460ba62c3c08bda2d6c971365066d7bd5cd08-merged.mount: Deactivated successfully.
Dec  1 04:32:47 np0005540741 podman[251876]: 2025-12-01 09:32:47.265563034 +0000 UTC m=+1.210434221 container remove d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_greider, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:32:47 np0005540741 systemd[1]: libpod-conmon-d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400.scope: Deactivated successfully.
Dec  1 04:32:47 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:32:47 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:32:47 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:32:47 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:32:47 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 628a28e3-921c-4801-990e-fce6753519ab does not exist
Dec  1 04:32:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v625: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:48 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:32:48 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:32:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v626: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:49 np0005540741 nova_compute[250706]: 2025-12-01 09:32:49.864 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:32:49 np0005540741 nova_compute[250706]: 2025-12-01 09:32:49.971 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:32:50 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:32:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v627: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v628: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:54 np0005540741 podman[251987]: 2025-12-01 09:32:54.042806009 +0000 UTC m=+0.139385491 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:32:54 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec  1 04:32:54 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/433414139' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 04:32:54 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec  1 04:32:54 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/433414139' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 04:32:54 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec  1 04:32:54 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/881557011' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 04:32:54 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec  1 04:32:54 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/881557011' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 04:32:55 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:32:55 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec  1 04:32:55 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2308178662' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 04:32:55 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec  1 04:32:55 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2308178662' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 04:32:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v629: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v630: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:32:57 np0005540741 podman[252013]: 2025-12-01 09:32:57.955497528 +0000 UTC m=+0.061976234 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 04:32:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v631: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:00 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:33:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v632: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v633: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:05 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:33:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v634: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v635: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v636: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:10 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:33:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v637: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:33:13
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['.mgr', 'cephfs.cephfs.meta', 'backups', 'images', 'cephfs.cephfs.data', 'vms', 'volumes']
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:33:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v638: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:14 np0005540741 podman[252033]: 2025-12-01 09:33:14.96009004 +0000 UTC m=+0.061470330 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 04:33:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:33:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v639: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v640: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:33:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:33:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:33:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:33:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:33:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:33:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:33:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:33:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:33:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:33:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:33:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:33:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:33:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:33:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:33:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v641: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:33:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:33:20.468 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:33:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:33:20.469 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:33:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:33:20.469 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:33:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v642: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v643: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:25 np0005540741 podman[252053]: 2025-12-01 09:33:25.024765403 +0000 UTC m=+0.120177420 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  1 04:33:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:33:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v644: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.054 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.055 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.055 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.055 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.077 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.077 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.078 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.078 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.078 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.079 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.079 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.079 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.079 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.115 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.115 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.116 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.117 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.117 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:33:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec  1 04:33:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/244408483' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 04:33:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v645: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.616 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.792 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.794 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5326MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.794 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:33:27 np0005540741 nova_compute[250706]: 2025-12-01 09:33:27.794 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:33:28 np0005540741 nova_compute[250706]: 2025-12-01 09:33:28.022 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 04:33:28 np0005540741 nova_compute[250706]: 2025-12-01 09:33:28.022 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 04:33:28 np0005540741 nova_compute[250706]: 2025-12-01 09:33:28.062 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:33:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec  1 04:33:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1698179103' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 04:33:28 np0005540741 nova_compute[250706]: 2025-12-01 09:33:28.801 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.739s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:33:28 np0005540741 nova_compute[250706]: 2025-12-01 09:33:28.809 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 04:33:28 np0005540741 nova_compute[250706]: 2025-12-01 09:33:28.835 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 04:33:28 np0005540741 nova_compute[250706]: 2025-12-01 09:33:28.837 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 04:33:28 np0005540741 nova_compute[250706]: 2025-12-01 09:33:28.837 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.043s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:33:28 np0005540741 podman[252123]: 2025-12-01 09:33:28.963546412 +0000 UTC m=+0.065084014 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec  1 04:33:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v646: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:33:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v647: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v648: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:35 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:33:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v649: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v650: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v651: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #33. Immutable memtables: 0.
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.206612) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 33
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581620206689, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 2023, "num_deletes": 505, "total_data_size": 1939274, "memory_usage": 1977720, "flush_reason": "Manual Compaction"}
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #34: started
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581620245115, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 34, "file_size": 1891888, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12009, "largest_seqno": 14031, "table_properties": {"data_size": 1883212, "index_size": 4854, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2757, "raw_key_size": 19951, "raw_average_key_size": 18, "raw_value_size": 1863871, "raw_average_value_size": 1732, "num_data_blocks": 224, "num_entries": 1076, "num_filter_entries": 1076, "num_deletions": 505, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764581425, "oldest_key_time": 1764581425, "file_creation_time": 1764581620, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 34, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 38563 microseconds, and 7801 cpu microseconds.
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.245187) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #34: 1891888 bytes OK
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.245218) [db/memtable_list.cc:519] [default] Level-0 commit table #34 started
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.247038) [db/memtable_list.cc:722] [default] Level-0 commit table #34: memtable #1 done
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.247054) EVENT_LOG_v1 {"time_micros": 1764581620247048, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.247078) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 1929720, prev total WAL file size 1946472, number of live WAL files 2.
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000030.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.247815) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323531' seq:0, type:0; will stop at (end)
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [34(1847KB)], [32(4549KB)]
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581620247875, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [34], "files_L6": [32], "score": -1, "input_data_size": 6550343, "oldest_snapshot_seqno": -1}
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #35: 3252 keys, 5134531 bytes, temperature: kUnknown
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581620301903, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 35, "file_size": 5134531, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5110952, "index_size": 14318, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8197, "raw_key_size": 77195, "raw_average_key_size": 23, "raw_value_size": 5050647, "raw_average_value_size": 1553, "num_data_blocks": 622, "num_entries": 3252, "num_filter_entries": 3252, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764581620, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.302151) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 5134531 bytes
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.303653) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 121.1 rd, 94.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 4.4 +0.0 blob) out(4.9 +0.0 blob), read-write-amplify(6.2) write-amplify(2.7) OK, records in: 4275, records dropped: 1023 output_compression: NoCompression
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.303671) EVENT_LOG_v1 {"time_micros": 1764581620303662, "job": 14, "event": "compaction_finished", "compaction_time_micros": 54107, "compaction_time_cpu_micros": 33168, "output_level": 6, "num_output_files": 1, "total_output_size": 5134531, "num_input_records": 4275, "num_output_records": 3252, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000034.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581620304125, "job": 14, "event": "table_file_deletion", "file_number": 34}
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581620305233, "job": 14, "event": "table_file_deletion", "file_number": 32}
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.247682) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.305376) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.305385) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.305388) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.305390) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:33:40 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.305392) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:33:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v652: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:33:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:33:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:33:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:33:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:33:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:33:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v653: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:45 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:33:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v654: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:45 np0005540741 podman[252142]: 2025-12-01 09:33:45.98275966 +0000 UTC m=+0.076912664 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3)
Dec  1 04:33:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v655: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "version", "format": "json"} v 0) v1
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3385313253' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:33:48 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14328 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Dec  1 04:33:48 np0005540741 ceph-mgr[75324]: [volumes INFO volumes.module] Starting _cmd_fs_volume_ls(format:json, prefix:fs volume ls) < ""
Dec  1 04:33:48 np0005540741 ceph-mgr[75324]: [volumes INFO volumes.module] Finishing _cmd_fs_volume_ls(format:json, prefix:fs volume ls) < ""
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:33:48 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev cf6f30bc-62b1-4f39-9170-8428315dad41 does not exist
Dec  1 04:33:48 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 922345fe-cd68-47fa-8d00-1c460b97c3ad does not exist
Dec  1 04:33:48 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 13e72bea-954e-4288-8864-bc0c63ae10b8 does not exist
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:33:48 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:33:49 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:33:49 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:33:49 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:33:49 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:33:49 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:33:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v656: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:49 np0005540741 podman[252552]: 2025-12-01 09:33:49.609089119 +0000 UTC m=+0.067902795 container create 207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_almeida, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS)
Dec  1 04:33:49 np0005540741 systemd[1]: Started libpod-conmon-207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1.scope.
Dec  1 04:33:49 np0005540741 podman[252552]: 2025-12-01 09:33:49.581816914 +0000 UTC m=+0.040630670 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:33:49 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:33:49 np0005540741 podman[252552]: 2025-12-01 09:33:49.712226847 +0000 UTC m=+0.171040553 container init 207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_almeida, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec  1 04:33:49 np0005540741 podman[252552]: 2025-12-01 09:33:49.721345179 +0000 UTC m=+0.180158885 container start 207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_almeida, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:33:49 np0005540741 podman[252552]: 2025-12-01 09:33:49.72659771 +0000 UTC m=+0.185411426 container attach 207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_almeida, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:33:49 np0005540741 gallant_almeida[252568]: 167 167
Dec  1 04:33:49 np0005540741 systemd[1]: libpod-207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1.scope: Deactivated successfully.
Dec  1 04:33:49 np0005540741 conmon[252568]: conmon 207ee24a86bd3dee9c7d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1.scope/container/memory.events
Dec  1 04:33:49 np0005540741 podman[252573]: 2025-12-01 09:33:49.773347846 +0000 UTC m=+0.026232796 container died 207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_almeida, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Dec  1 04:33:49 np0005540741 systemd[1]: var-lib-containers-storage-overlay-1cfcfafe77cc15f66767b2972a25715046507fc3aa71dc0698102596c9854407-merged.mount: Deactivated successfully.
Dec  1 04:33:49 np0005540741 podman[252573]: 2025-12-01 09:33:49.812862063 +0000 UTC m=+0.065746993 container remove 207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_almeida, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec  1 04:33:49 np0005540741 systemd[1]: libpod-conmon-207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1.scope: Deactivated successfully.
Dec  1 04:33:49 np0005540741 podman[252595]: 2025-12-01 09:33:49.971331873 +0000 UTC m=+0.039432646 container create f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_yalow, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:33:50 np0005540741 systemd[1]: Started libpod-conmon-f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217.scope.
Dec  1 04:33:50 np0005540741 podman[252595]: 2025-12-01 09:33:49.952401668 +0000 UTC m=+0.020502491 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:33:50 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:33:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0081046c4980198874de97eb691a33c678aa58c11b36ebf7ef3ba3c82e0aada0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:33:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0081046c4980198874de97eb691a33c678aa58c11b36ebf7ef3ba3c82e0aada0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:33:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0081046c4980198874de97eb691a33c678aa58c11b36ebf7ef3ba3c82e0aada0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:33:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0081046c4980198874de97eb691a33c678aa58c11b36ebf7ef3ba3c82e0aada0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:33:50 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0081046c4980198874de97eb691a33c678aa58c11b36ebf7ef3ba3c82e0aada0/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:33:50 np0005540741 podman[252595]: 2025-12-01 09:33:50.078074594 +0000 UTC m=+0.146175467 container init f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_yalow, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:33:50 np0005540741 podman[252595]: 2025-12-01 09:33:50.091178351 +0000 UTC m=+0.159279134 container start f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_yalow, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:33:50 np0005540741 podman[252595]: 2025-12-01 09:33:50.095236338 +0000 UTC m=+0.163337161 container attach f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_yalow, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:33:50 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:33:51 np0005540741 gifted_yalow[252612]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:33:51 np0005540741 gifted_yalow[252612]: --> relative data size: 1.0
Dec  1 04:33:51 np0005540741 gifted_yalow[252612]: --> All data devices are unavailable
Dec  1 04:33:51 np0005540741 systemd[1]: libpod-f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217.scope: Deactivated successfully.
Dec  1 04:33:51 np0005540741 systemd[1]: libpod-f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217.scope: Consumed 1.011s CPU time.
Dec  1 04:33:51 np0005540741 podman[252595]: 2025-12-01 09:33:51.158245907 +0000 UTC m=+1.226346720 container died f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_yalow, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec  1 04:33:51 np0005540741 systemd[1]: var-lib-containers-storage-overlay-0081046c4980198874de97eb691a33c678aa58c11b36ebf7ef3ba3c82e0aada0-merged.mount: Deactivated successfully.
Dec  1 04:33:51 np0005540741 podman[252595]: 2025-12-01 09:33:51.432911469 +0000 UTC m=+1.501012252 container remove f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_yalow, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec  1 04:33:51 np0005540741 systemd[1]: libpod-conmon-f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217.scope: Deactivated successfully.
Dec  1 04:33:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v657: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:33:52 np0005540741 podman[252794]: 2025-12-01 09:33:52.069865868 +0000 UTC m=+0.050013950 container create 2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_beaver, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Dec  1 04:33:52 np0005540741 systemd[1]: Started libpod-conmon-2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297.scope.
Dec  1 04:33:52 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:33:52 np0005540741 podman[252794]: 2025-12-01 09:33:52.048057791 +0000 UTC m=+0.028205883 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:33:52 np0005540741 podman[252794]: 2025-12-01 09:33:52.15159052 +0000 UTC m=+0.131738722 container init 2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_beaver, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec  1 04:33:52 np0005540741 podman[252794]: 2025-12-01 09:33:52.160244939 +0000 UTC m=+0.140393021 container start 2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_beaver, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:33:52 np0005540741 podman[252794]: 2025-12-01 09:33:52.163924075 +0000 UTC m=+0.144072257 container attach 2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_beaver, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 04:33:52 np0005540741 confident_beaver[252810]: 167 167
Dec  1 04:33:52 np0005540741 systemd[1]: libpod-2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297.scope: Deactivated successfully.
Dec  1 04:33:52 np0005540741 podman[252794]: 2025-12-01 09:33:52.167122327 +0000 UTC m=+0.147270429 container died 2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_beaver, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:33:52 np0005540741 systemd[1]: var-lib-containers-storage-overlay-5132fb86fd37443add15ceb550d92f191cd0ab8b857ef0c4bdb2b92a2a6d15f5-merged.mount: Deactivated successfully.
Dec  1 04:33:52 np0005540741 podman[252794]: 2025-12-01 09:33:52.214470069 +0000 UTC m=+0.194618191 container remove 2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_beaver, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Dec  1 04:33:52 np0005540741 systemd[1]: libpod-conmon-2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297.scope: Deactivated successfully.
Dec  1 04:33:52 np0005540741 podman[252833]: 2025-12-01 09:33:52.458502131 +0000 UTC m=+0.062893580 container create 8f2b80f71b3e039e47c7b3b35451f849b24a5e4e7c39a3cf856e6acdb45f8dae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_hugle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:33:52 np0005540741 systemd[1]: Started libpod-conmon-8f2b80f71b3e039e47c7b3b35451f849b24a5e4e7c39a3cf856e6acdb45f8dae.scope.
Dec  1 04:33:52 np0005540741 podman[252833]: 2025-12-01 09:33:52.435800038 +0000 UTC m=+0.040191487 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:33:52 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:33:52 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8275c0984f5c1028e49633f9cd468d5738fe6ba121c8f76b5d9885c17522e761/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:33:52 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8275c0984f5c1028e49633f9cd468d5738fe6ba121c8f76b5d9885c17522e761/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:33:52 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8275c0984f5c1028e49633f9cd468d5738fe6ba121c8f76b5d9885c17522e761/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:33:52 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8275c0984f5c1028e49633f9cd468d5738fe6ba121c8f76b5d9885c17522e761/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:33:52 np0005540741 podman[252833]: 2025-12-01 09:33:52.554109732 +0000 UTC m=+0.158501161 container init 8f2b80f71b3e039e47c7b3b35451f849b24a5e4e7c39a3cf856e6acdb45f8dae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_hugle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:33:52 np0005540741 podman[252833]: 2025-12-01 09:33:52.561224157 +0000 UTC m=+0.165615586 container start 8f2b80f71b3e039e47c7b3b35451f849b24a5e4e7c39a3cf856e6acdb45f8dae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_hugle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec  1 04:33:52 np0005540741 podman[252833]: 2025-12-01 09:33:52.564483861 +0000 UTC m=+0.168875320 container attach 8f2b80f71b3e039e47c7b3b35451f849b24a5e4e7c39a3cf856e6acdb45f8dae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_hugle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]: {
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:    "0": [
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:        {
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:            "devices": [
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:                "/dev/loop3"
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:            ],
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:            "lv_name": "ceph_lv0",
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:            "lv_size": "21470642176",
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:            "name": "ceph_lv0",
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:            "tags": {
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:                "ceph.cluster_name": "ceph",
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:                "ceph.crush_device_class": "",
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:                "ceph.encrypted": "0",
Dec  1 04:33:53 np0005540741 heuristic_hugle[252849]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:36:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v751: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:36:50 np0005540741 rsyslogd[1007]: imjournal: 1311 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec  1 04:36:50 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e52 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:36:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v752: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:36:51 np0005540741 podman[255439]: 2025-12-01 09:36:51.977892242 +0000 UTC m=+0.068140272 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec  1 04:36:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v753: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:36:55 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e52 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:36:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v754: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:36:56 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e52 do_prune osdmap full prune enabled
Dec  1 04:36:56 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e53 e53: 3 total, 3 up, 3 in
Dec  1 04:36:56 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e53: 3 total, 3 up, 3 in
Dec  1 04:36:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v756: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 102 B/s wr, 1 op/s
Dec  1 04:36:59 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e53 do_prune osdmap full prune enabled
Dec  1 04:36:59 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e54 e54: 3 total, 3 up, 3 in
Dec  1 04:36:59 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e54: 3 total, 3 up, 3 in
Dec  1 04:36:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v758: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 127 B/s wr, 2 op/s
Dec  1 04:37:00 np0005540741 podman[255460]: 2025-12-01 09:37:00.016971917 +0000 UTC m=+0.117157642 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  1 04:37:00 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:37:01 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e54 do_prune osdmap full prune enabled
Dec  1 04:37:01 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e55 e55: 3 total, 3 up, 3 in
Dec  1 04:37:01 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e55: 3 total, 3 up, 3 in
Dec  1 04:37:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v760: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 54 KiB/s rd, 5.3 KiB/s wr, 74 op/s
Dec  1 04:37:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e55 do_prune osdmap full prune enabled
Dec  1 04:37:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e56 e56: 3 total, 3 up, 3 in
Dec  1 04:37:03 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e56: 3 total, 3 up, 3 in
Dec  1 04:37:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v762: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 55 KiB/s rd, 5.2 KiB/s wr, 74 op/s
Dec  1 04:37:04 np0005540741 podman[255487]: 2025-12-01 09:37:04.980517334 +0000 UTC m=+0.079075827 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Dec  1 04:37:05 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e56 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:37:05 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e56 do_prune osdmap full prune enabled
Dec  1 04:37:05 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e57 e57: 3 total, 3 up, 3 in
Dec  1 04:37:05 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e57: 3 total, 3 up, 3 in
Dec  1 04:37:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v764: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 125 KiB/s rd, 11 KiB/s wr, 170 op/s
Dec  1 04:37:07 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e57 do_prune osdmap full prune enabled
Dec  1 04:37:07 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e58 e58: 3 total, 3 up, 3 in
Dec  1 04:37:07 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e58: 3 total, 3 up, 3 in
Dec  1 04:37:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v766: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 98 KiB/s rd, 11 KiB/s wr, 139 op/s
Dec  1 04:37:08 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e58 do_prune osdmap full prune enabled
Dec  1 04:37:08 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e59 e59: 3 total, 3 up, 3 in
Dec  1 04:37:08 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e59: 3 total, 3 up, 3 in
Dec  1 04:37:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e59 do_prune osdmap full prune enabled
Dec  1 04:37:09 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e60 e60: 3 total, 3 up, 3 in
Dec  1 04:37:09 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e60: 3 total, 3 up, 3 in
Dec  1 04:37:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v769: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 7.4 KiB/s wr, 57 op/s
Dec  1 04:37:10 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:37:10 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e60 do_prune osdmap full prune enabled
Dec  1 04:37:10 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e61 e61: 3 total, 3 up, 3 in
Dec  1 04:37:10 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e61: 3 total, 3 up, 3 in
Dec  1 04:37:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e61 do_prune osdmap full prune enabled
Dec  1 04:37:11 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e62 e62: 3 total, 3 up, 3 in
Dec  1 04:37:11 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e62: 3 total, 3 up, 3 in
Dec  1 04:37:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v772: 193 pgs: 193 active+clean; 105 MiB data, 186 MiB used, 60 GiB / 60 GiB avail; 125 KiB/s rd, 16 MiB/s wr, 172 op/s
Dec  1 04:37:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e62 do_prune osdmap full prune enabled
Dec  1 04:37:12 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e63 e63: 3 total, 3 up, 3 in
Dec  1 04:37:12 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e63: 3 total, 3 up, 3 in
Dec  1 04:37:12 np0005540741 podman[255681]: 2025-12-01 09:37:12.584662175 +0000 UTC m=+0.081588508 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True)
Dec  1 04:37:12 np0005540741 podman[255681]: 2025-12-01 09:37:12.70991858 +0000 UTC m=+0.206844923 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:37:13
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'cephfs.cephfs.meta', 'vms', 'images', 'volumes', '.mgr', 'cephfs.cephfs.data']
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:37:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e63 do_prune osdmap full prune enabled
Dec  1 04:37:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e64 e64: 3 total, 3 up, 3 in
Dec  1 04:37:13 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e64: 3 total, 3 up, 3 in
Dec  1 04:37:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:37:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:37:13 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:37:13 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:37:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v775: 193 pgs: 193 active+clean; 105 MiB data, 186 MiB used, 60 GiB / 60 GiB avail; 234 KiB/s rd, 28 MiB/s wr, 333 op/s
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:37:14 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 7fd36b44-d0b6-42f0-a75d-245c120b3058 does not exist
Dec  1 04:37:14 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 94a9b564-905c-46a4-9bda-44359844445a does not exist
Dec  1 04:37:14 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 8c8a5006-de33-43f5-baaa-981aa2e9e3da does not exist
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e64 do_prune osdmap full prune enabled
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e65 e65: 3 total, 3 up, 3 in
Dec  1 04:37:14 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e65: 3 total, 3 up, 3 in
Dec  1 04:37:14 np0005540741 podman[256091]: 2025-12-01 09:37:14.773920482 +0000 UTC m=+0.054583761 container create def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_rubin, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec  1 04:37:14 np0005540741 systemd[1]: Started libpod-conmon-def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e.scope.
Dec  1 04:37:14 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:37:14 np0005540741 podman[256091]: 2025-12-01 09:37:14.745532566 +0000 UTC m=+0.026195925 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:37:14 np0005540741 podman[256091]: 2025-12-01 09:37:14.85202234 +0000 UTC m=+0.132685629 container init def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_rubin, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef)
Dec  1 04:37:14 np0005540741 podman[256091]: 2025-12-01 09:37:14.859542226 +0000 UTC m=+0.140205495 container start def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_rubin, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:37:14 np0005540741 crazy_rubin[256107]: 167 167
Dec  1 04:37:14 np0005540741 podman[256091]: 2025-12-01 09:37:14.864238141 +0000 UTC m=+0.144901410 container attach def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_rubin, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec  1 04:37:14 np0005540741 systemd[1]: libpod-def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e.scope: Deactivated successfully.
Dec  1 04:37:14 np0005540741 podman[256091]: 2025-12-01 09:37:14.866105825 +0000 UTC m=+0.146769094 container died def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_rubin, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:37:14 np0005540741 systemd[1]: var-lib-containers-storage-overlay-5fbaab6f1aab82115def2925b4fa2bfe38057812eedd4c773d4c7eb957c71fd4-merged.mount: Deactivated successfully.
Dec  1 04:37:14 np0005540741 podman[256091]: 2025-12-01 09:37:14.905237111 +0000 UTC m=+0.185900370 container remove def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_rubin, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec  1 04:37:14 np0005540741 systemd[1]: libpod-conmon-def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e.scope: Deactivated successfully.
Dec  1 04:37:15 np0005540741 podman[256129]: 2025-12-01 09:37:15.087608759 +0000 UTC m=+0.039871228 container create ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_gauss, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec  1 04:37:15 np0005540741 systemd[1]: Started libpod-conmon-ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a.scope.
Dec  1 04:37:15 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:37:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/780fc2eadec1ee4711f5812ca6edb98d4090c3f4388eb318fcbf398af6518aa6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:37:15 np0005540741 podman[256129]: 2025-12-01 09:37:15.067735857 +0000 UTC m=+0.019998246 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:37:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/780fc2eadec1ee4711f5812ca6edb98d4090c3f4388eb318fcbf398af6518aa6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:37:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/780fc2eadec1ee4711f5812ca6edb98d4090c3f4388eb318fcbf398af6518aa6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:37:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/780fc2eadec1ee4711f5812ca6edb98d4090c3f4388eb318fcbf398af6518aa6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:37:15 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/780fc2eadec1ee4711f5812ca6edb98d4090c3f4388eb318fcbf398af6518aa6/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:37:15 np0005540741 podman[256129]: 2025-12-01 09:37:15.179419231 +0000 UTC m=+0.131681670 container init ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_gauss, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:37:15 np0005540741 podman[256129]: 2025-12-01 09:37:15.192610751 +0000 UTC m=+0.144873160 container start ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_gauss, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec  1 04:37:15 np0005540741 podman[256129]: 2025-12-01 09:37:15.196327977 +0000 UTC m=+0.148590416 container attach ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_gauss, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  1 04:37:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e65 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:37:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e65 do_prune osdmap full prune enabled
Dec  1 04:37:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e66 e66: 3 total, 3 up, 3 in
Dec  1 04:37:15 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e66: 3 total, 3 up, 3 in
Dec  1 04:37:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v778: 193 pgs: 193 active+clean; 41 MiB data, 126 MiB used, 60 GiB / 60 GiB avail; 213 KiB/s rd, 12 MiB/s wr, 305 op/s
Dec  1 04:37:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e66 do_prune osdmap full prune enabled
Dec  1 04:37:16 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e67 e67: 3 total, 3 up, 3 in
Dec  1 04:37:16 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e67: 3 total, 3 up, 3 in
Dec  1 04:37:16 np0005540741 naughty_gauss[256145]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:37:16 np0005540741 naughty_gauss[256145]: --> relative data size: 1.0
Dec  1 04:37:16 np0005540741 naughty_gauss[256145]: --> All data devices are unavailable
Dec  1 04:37:16 np0005540741 systemd[1]: libpod-ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a.scope: Deactivated successfully.
Dec  1 04:37:16 np0005540741 systemd[1]: libpod-ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a.scope: Consumed 1.177s CPU time.
Dec  1 04:37:16 np0005540741 podman[256129]: 2025-12-01 09:37:16.436760681 +0000 UTC m=+1.389023090 container died ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_gauss, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507)
Dec  1 04:37:16 np0005540741 systemd[1]: var-lib-containers-storage-overlay-780fc2eadec1ee4711f5812ca6edb98d4090c3f4388eb318fcbf398af6518aa6-merged.mount: Deactivated successfully.
Dec  1 04:37:16 np0005540741 podman[256129]: 2025-12-01 09:37:16.504775588 +0000 UTC m=+1.457037967 container remove ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_gauss, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec  1 04:37:16 np0005540741 systemd[1]: libpod-conmon-ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a.scope: Deactivated successfully.
Dec  1 04:37:17 np0005540741 podman[256324]: 2025-12-01 09:37:17.192134007 +0000 UTC m=+0.069257704 container create a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_volhard, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:37:17 np0005540741 systemd[1]: Started libpod-conmon-a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab.scope.
Dec  1 04:37:17 np0005540741 podman[256324]: 2025-12-01 09:37:17.158410637 +0000 UTC m=+0.035534364 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:37:17 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:37:17 np0005540741 podman[256324]: 2025-12-01 09:37:17.27947908 +0000 UTC m=+0.156602817 container init a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_volhard, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef)
Dec  1 04:37:17 np0005540741 podman[256324]: 2025-12-01 09:37:17.285183555 +0000 UTC m=+0.162307272 container start a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_volhard, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:37:17 np0005540741 vigilant_volhard[256340]: 167 167
Dec  1 04:37:17 np0005540741 systemd[1]: libpod-a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab.scope: Deactivated successfully.
Dec  1 04:37:17 np0005540741 podman[256324]: 2025-12-01 09:37:17.28953963 +0000 UTC m=+0.166663347 container attach a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_volhard, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:37:17 np0005540741 podman[256324]: 2025-12-01 09:37:17.289824648 +0000 UTC m=+0.166948355 container died a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_volhard, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:37:17 np0005540741 systemd[1]: var-lib-containers-storage-overlay-5a932a4de8bea6e555a3ac17f20413f624c6c021ab30ad9f6aeaf8daf3d5d9d7-merged.mount: Deactivated successfully.
Dec  1 04:37:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e67 do_prune osdmap full prune enabled
Dec  1 04:37:17 np0005540741 podman[256324]: 2025-12-01 09:37:17.330791507 +0000 UTC m=+0.207915204 container remove a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_volhard, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec  1 04:37:17 np0005540741 systemd[1]: libpod-conmon-a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab.scope: Deactivated successfully.
Dec  1 04:37:17 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e68 e68: 3 total, 3 up, 3 in
Dec  1 04:37:17 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e68: 3 total, 3 up, 3 in
Dec  1 04:37:17 np0005540741 podman[256365]: 2025-12-01 09:37:17.519072495 +0000 UTC m=+0.023221719 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:37:17 np0005540741 podman[256365]: 2025-12-01 09:37:17.622151631 +0000 UTC m=+0.126300865 container create b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_taussig, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:37:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v781: 193 pgs: 193 active+clean; 41 MiB data, 130 MiB used, 60 GiB / 60 GiB avail; 195 KiB/s rd, 17 KiB/s wr, 268 op/s
Dec  1 04:37:17 np0005540741 systemd[1]: Started libpod-conmon-b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8.scope.
Dec  1 04:37:17 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:37:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae3a8a21a23d8773fcfc890b07dc550d22ab70c32a7f231b1353f336fdc1f74b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:37:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae3a8a21a23d8773fcfc890b07dc550d22ab70c32a7f231b1353f336fdc1f74b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:37:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae3a8a21a23d8773fcfc890b07dc550d22ab70c32a7f231b1353f336fdc1f74b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:37:17 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae3a8a21a23d8773fcfc890b07dc550d22ab70c32a7f231b1353f336fdc1f74b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:37:17 np0005540741 podman[256365]: 2025-12-01 09:37:17.734976048 +0000 UTC m=+0.239125342 container init b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_taussig, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:37:17 np0005540741 podman[256365]: 2025-12-01 09:37:17.746231652 +0000 UTC m=+0.250380856 container start b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_taussig, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec  1 04:37:17 np0005540741 podman[256365]: 2025-12-01 09:37:17.771913101 +0000 UTC m=+0.276062345 container attach b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_taussig, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec  1 04:37:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e68 do_prune osdmap full prune enabled
Dec  1 04:37:18 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e69 e69: 3 total, 3 up, 3 in
Dec  1 04:37:18 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e69: 3 total, 3 up, 3 in
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]: {
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:    "0": [
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:        {
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "devices": [
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "/dev/loop3"
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            ],
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "lv_name": "ceph_lv0",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "lv_size": "21470642176",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "name": "ceph_lv0",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "tags": {
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.cluster_name": "ceph",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.crush_device_class": "",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.encrypted": "0",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.osd_id": "0",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.type": "block",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.vdo": "0"
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            },
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "type": "block",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "vg_name": "ceph_vg0"
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:        }
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:    ],
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:    "1": [
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:        {
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "devices": [
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "/dev/loop4"
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            ],
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "lv_name": "ceph_lv1",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "lv_size": "21470642176",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "name": "ceph_lv1",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "tags": {
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.cluster_name": "ceph",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.crush_device_class": "",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.encrypted": "0",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.osd_id": "1",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.type": "block",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.vdo": "0"
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            },
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "type": "block",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "vg_name": "ceph_vg1"
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:        }
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:    ],
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:    "2": [
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:        {
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "devices": [
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "/dev/loop5"
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            ],
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "lv_name": "ceph_lv2",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "lv_size": "21470642176",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "name": "ceph_lv2",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "tags": {
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.cluster_name": "ceph",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.crush_device_class": "",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.encrypted": "0",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.osd_id": "2",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.type": "block",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:                "ceph.vdo": "0"
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            },
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "type": "block",
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:            "vg_name": "ceph_vg2"
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:        }
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]:    ]
Dec  1 04:37:18 np0005540741 romantic_taussig[256381]: }
Dec  1 04:37:18 np0005540741 nova_compute[250706]: 2025-12-01 09:37:18.559 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "6740b382-574d-4ced-a156-11a531b94114" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:37:18 np0005540741 nova_compute[250706]: 2025-12-01 09:37:18.561 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "6740b382-574d-4ced-a156-11a531b94114" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:37:18 np0005540741 nova_compute[250706]: 2025-12-01 09:37:18.581 250710 DEBUG nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  1 04:37:18 np0005540741 systemd[1]: libpod-b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8.scope: Deactivated successfully.
Dec  1 04:37:18 np0005540741 podman[256365]: 2025-12-01 09:37:18.588282192 +0000 UTC m=+1.092431396 container died b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_taussig, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec  1 04:37:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:37:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:37:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:37:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:37:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:37:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:37:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 1.9077212346161359e-07 of space, bias 1.0, pg target 5.723163703848408e-05 quantized to 32 (current 32)
Dec  1 04:37:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:37:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:37:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:37:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006672572971609038 of space, bias 1.0, pg target 0.20017718914827115 quantized to 32 (current 32)
Dec  1 04:37:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:37:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:37:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:37:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:37:18 np0005540741 systemd[1]: var-lib-containers-storage-overlay-ae3a8a21a23d8773fcfc890b07dc550d22ab70c32a7f231b1353f336fdc1f74b-merged.mount: Deactivated successfully.
Dec  1 04:37:18 np0005540741 podman[256365]: 2025-12-01 09:37:18.647261339 +0000 UTC m=+1.151410543 container remove b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_taussig, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:37:18 np0005540741 systemd[1]: libpod-conmon-b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8.scope: Deactivated successfully.
Dec  1 04:37:18 np0005540741 nova_compute[250706]: 2025-12-01 09:37:18.689 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:37:18 np0005540741 nova_compute[250706]: 2025-12-01 09:37:18.690 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:37:18 np0005540741 nova_compute[250706]: 2025-12-01 09:37:18.698 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  1 04:37:18 np0005540741 nova_compute[250706]: 2025-12-01 09:37:18.698 250710 INFO nova.compute.claims [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  1 04:37:18 np0005540741 nova_compute[250706]: 2025-12-01 09:37:18.815 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:37:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec  1 04:37:19 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/149037888' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 04:37:19 np0005540741 nova_compute[250706]: 2025-12-01 09:37:19.252 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:37:19 np0005540741 nova_compute[250706]: 2025-12-01 09:37:19.259 250710 DEBUG nova.compute.provider_tree [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 04:37:19 np0005540741 nova_compute[250706]: 2025-12-01 09:37:19.292 250710 DEBUG nova.scheduler.client.report [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 04:37:19 np0005540741 nova_compute[250706]: 2025-12-01 09:37:19.318 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:37:19 np0005540741 nova_compute[250706]: 2025-12-01 09:37:19.319 250710 DEBUG nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  1 04:37:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e69 do_prune osdmap full prune enabled
Dec  1 04:37:19 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e70 e70: 3 total, 3 up, 3 in
Dec  1 04:37:19 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e70: 3 total, 3 up, 3 in
Dec  1 04:37:19 np0005540741 nova_compute[250706]: 2025-12-01 09:37:19.368 250710 DEBUG nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  1 04:37:19 np0005540741 nova_compute[250706]: 2025-12-01 09:37:19.369 250710 DEBUG nova.network.neutron [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  1 04:37:19 np0005540741 podman[256561]: 2025-12-01 09:37:19.36945535 +0000 UTC m=+0.063599811 container create 289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_goldstine, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Dec  1 04:37:19 np0005540741 nova_compute[250706]: 2025-12-01 09:37:19.397 250710 INFO nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  1 04:37:19 np0005540741 nova_compute[250706]: 2025-12-01 09:37:19.423 250710 DEBUG nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  1 04:37:19 np0005540741 podman[256561]: 2025-12-01 09:37:19.339854018 +0000 UTC m=+0.033998509 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:37:19 np0005540741 systemd[1]: Started libpod-conmon-289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047.scope.
Dec  1 04:37:19 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:37:19 np0005540741 nova_compute[250706]: 2025-12-01 09:37:19.477 250710 INFO nova.virt.block_device [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Booting with volume 61c9bc29-5cbf-4816-a0ae-b24ddf88776c at /dev/vda#033[00m
Dec  1 04:37:19 np0005540741 podman[256561]: 2025-12-01 09:37:19.48484022 +0000 UTC m=+0.178984661 container init 289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_goldstine, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True)
Dec  1 04:37:19 np0005540741 podman[256561]: 2025-12-01 09:37:19.496324741 +0000 UTC m=+0.190469162 container start 289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_goldstine, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:37:19 np0005540741 podman[256561]: 2025-12-01 09:37:19.49875199 +0000 UTC m=+0.192896431 container attach 289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_goldstine, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  1 04:37:19 np0005540741 musing_goldstine[256577]: 167 167
Dec  1 04:37:19 np0005540741 systemd[1]: libpod-289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047.scope: Deactivated successfully.
Dec  1 04:37:19 np0005540741 podman[256561]: 2025-12-01 09:37:19.502967412 +0000 UTC m=+0.197111833 container died 289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_goldstine, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec  1 04:37:19 np0005540741 systemd[1]: var-lib-containers-storage-overlay-a1259d33cc225eafb375638b21ca8e7ecf04d38db0b9d7d11879af4abc34a161-merged.mount: Deactivated successfully.
Dec  1 04:37:19 np0005540741 podman[256561]: 2025-12-01 09:37:19.5432302 +0000 UTC m=+0.237374621 container remove 289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_goldstine, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:37:19 np0005540741 systemd[1]: libpod-conmon-289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047.scope: Deactivated successfully.
Dec  1 04:37:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v784: 193 pgs: 193 active+clean; 41 MiB data, 130 MiB used, 60 GiB / 60 GiB avail; 91 KiB/s rd, 7.5 KiB/s wr, 123 op/s
Dec  1 04:37:19 np0005540741 podman[256601]: 2025-12-01 09:37:19.737533732 +0000 UTC m=+0.062103119 container create d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_zhukovsky, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:37:19 np0005540741 systemd[1]: Started libpod-conmon-d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130.scope.
Dec  1 04:37:19 np0005540741 podman[256601]: 2025-12-01 09:37:19.712155741 +0000 UTC m=+0.036725218 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:37:19 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:37:19 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4bcb6f30d6ac7e44735a9bf19ceaaff7b0ff9b47c1e6362b7fbed9cdd91cb89/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:37:19 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4bcb6f30d6ac7e44735a9bf19ceaaff7b0ff9b47c1e6362b7fbed9cdd91cb89/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:37:19 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4bcb6f30d6ac7e44735a9bf19ceaaff7b0ff9b47c1e6362b7fbed9cdd91cb89/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:37:19 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4bcb6f30d6ac7e44735a9bf19ceaaff7b0ff9b47c1e6362b7fbed9cdd91cb89/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:37:19 np0005540741 podman[256601]: 2025-12-01 09:37:19.842993256 +0000 UTC m=+0.167562663 container init d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_zhukovsky, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507)
Dec  1 04:37:19 np0005540741 podman[256601]: 2025-12-01 09:37:19.854914409 +0000 UTC m=+0.179483796 container start d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_zhukovsky, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec  1 04:37:19 np0005540741 podman[256601]: 2025-12-01 09:37:19.858499432 +0000 UTC m=+0.183068819 container attach d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_zhukovsky, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:37:19 np0005540741 nova_compute[250706]: 2025-12-01 09:37:19.978 250710 DEBUG os_brick.utils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.100', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-0.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Dec  1 04:37:19 np0005540741 nova_compute[250706]: 2025-12-01 09:37:19.984 250710 INFO oslo.privsep.daemon [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmp9do_h71r/privsep.sock']#033[00m
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.126 250710 DEBUG nova.network.neutron [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.127 250710 DEBUG nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  1 04:37:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e70 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:37:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e70 do_prune osdmap full prune enabled
Dec  1 04:37:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e71 e71: 3 total, 3 up, 3 in
Dec  1 04:37:20 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e71: 3 total, 3 up, 3 in
Dec  1 04:37:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:37:20.474 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:37:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:37:20.475 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:37:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:37:20.475 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.829 250710 INFO oslo.privsep.daemon [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Spawned new privsep daemon via rootwrap#033[00m
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.692 256632 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.696 256632 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.698 256632 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.698 256632 INFO oslo.privsep.daemon [-] privsep daemon running as pid 256632#033[00m
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.833 256632 DEBUG oslo.privsep.daemon [-] privsep: reply[db9da8ff-f118-4412-a8a0-fa71303549fc]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]: {
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:        "osd_id": 0,
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:        "type": "bluestore"
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:    },
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:        "osd_id": 1,
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:        "type": "bluestore"
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:    },
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:        "osd_id": 2,
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:        "type": "bluestore"
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]:    }
Dec  1 04:37:20 np0005540741 quizzical_zhukovsky[256618]: }
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.931 256632 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.945 256632 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.946 256632 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d93fb9-a643-4c8c-b079-d2447c5695d2]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.947 256632 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:37:20 np0005540741 systemd[1]: libpod-d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130.scope: Deactivated successfully.
Dec  1 04:37:20 np0005540741 podman[256601]: 2025-12-01 09:37:20.948553219 +0000 UTC m=+1.273122616 container died d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_zhukovsky, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:37:20 np0005540741 systemd[1]: libpod-d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130.scope: Consumed 1.087s CPU time.
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.957 256632 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.959 256632 DEBUG oslo.privsep.daemon [-] privsep: reply[02b9dbb7-b490-441a-8bc2-1d8b92c8aaad]: (4, ('InitiatorName=iqn.1994-05.com.redhat:44dd6092e7fe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.961 256632 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:37:20 np0005540741 systemd[1]: var-lib-containers-storage-overlay-d4bcb6f30d6ac7e44735a9bf19ceaaff7b0ff9b47c1e6362b7fbed9cdd91cb89-merged.mount: Deactivated successfully.
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.980 256632 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.980 256632 DEBUG oslo.privsep.daemon [-] privsep: reply[a3d225cb-38c3-4f3c-97f5-f7b22ef1f71d]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.983 256632 DEBUG oslo.privsep.daemon [-] privsep: reply[bffc43dc-ac5b-47dd-8247-e84f0c87e14b]: (4, '52310927-1d30-4bda-9d2b-fd9f7cfadc4d') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 04:37:20 np0005540741 nova_compute[250706]: 2025-12-01 09:37:20.983 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:37:21 np0005540741 podman[256601]: 2025-12-01 09:37:21.001962956 +0000 UTC m=+1.326532363 container remove d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_zhukovsky, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:37:21 np0005540741 nova_compute[250706]: 2025-12-01 09:37:21.007 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "nvme version" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:37:21 np0005540741 systemd[1]: libpod-conmon-d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130.scope: Deactivated successfully.
Dec  1 04:37:21 np0005540741 nova_compute[250706]: 2025-12-01 09:37:21.010 250710 DEBUG os_brick.initiator.connectors.lightos [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Dec  1 04:37:21 np0005540741 nova_compute[250706]: 2025-12-01 09:37:21.012 250710 DEBUG os_brick.initiator.connectors.lightos [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Dec  1 04:37:21 np0005540741 nova_compute[250706]: 2025-12-01 09:37:21.012 250710 DEBUG os_brick.initiator.connectors.lightos [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:a5e1c538-e584-4a91-80ec-e9f1f28a5aed dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Dec  1 04:37:21 np0005540741 nova_compute[250706]: 2025-12-01 09:37:21.013 250710 DEBUG os_brick.utils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] <== get_connector_properties: return (1030ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.100', 'host': 'compute-0.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:44dd6092e7fe', 'do_local_attach': False, 'nvme_hostid': 'a5e1c538-e584-4a91-80ec-e9f1f28a5aed', 'system uuid': '52310927-1d30-4bda-9d2b-fd9f7cfadc4d', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:a5e1c538-e584-4a91-80ec-e9f1f28a5aed', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Dec  1 04:37:21 np0005540741 nova_compute[250706]: 2025-12-01 09:37:21.013 250710 DEBUG nova.virt.block_device [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Updating existing volume attachment record: 48a15dd1-d08e-4947-98cd-2a9168bf85d9 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Dec  1 04:37:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:37:21 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:37:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:37:21 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:37:21 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 91f242c3-8997-4e22-8df6-7a888fc8ffd5 does not exist
Dec  1 04:37:21 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:37:21 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:37:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v786: 193 pgs: 2 active+clean+snaptrim, 8 active+clean+snaptrim_wait, 183 active+clean; 41 MiB data, 130 MiB used, 60 GiB / 60 GiB avail; 288 KiB/s rd, 25 KiB/s wr, 393 op/s
Dec  1 04:37:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Dec  1 04:37:21 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1389057543' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.482 250710 DEBUG nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.484 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.484 250710 INFO nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Creating image(s)#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.485 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.485 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Ensure instance console log exists: /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.485 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.486 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.486 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.488 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'guest_format': None, 'attachment_id': '48a15dd1-d08e-4947-98cd-2a9168bf85d9', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-61c9bc29-5cbf-4816-a0ae-b24ddf88776c', 'hosts': ['192.168.122.100'], 'ports': ['6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '61c9bc29-5cbf-4816-a0ae-b24ddf88776c', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '6740b382-574d-4ced-a156-11a531b94114', 'attached_at': '', 'detached_at': '', 'volume_id': '61c9bc29-5cbf-4816-a0ae-b24ddf88776c', 'serial': '61c9bc29-5cbf-4816-a0ae-b24ddf88776c'}, 'device_type': 'disk', 'delete_on_termination': True, 'boot_index': 0, 'disk_bus': 'virtio', 'mount_device': '/dev/vda', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.494 250710 WARNING nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.501 250710 DEBUG nova.virt.libvirt.host [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.501 250710 DEBUG nova.virt.libvirt.host [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.504 250710 DEBUG nova.virt.libvirt.host [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.505 250710 DEBUG nova.virt.libvirt.host [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.505 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.505 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-01T09:36:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dff9230f-1656-4ee2-9f6d-710f2458058e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.506 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.506 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.506 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.506 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.507 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.507 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.507 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.507 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.507 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.508 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.533 250710 DEBUG nova.storage.rbd_utils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image 6740b382-574d-4ced-a156-11a531b94114_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.538 250710 DEBUG nova.privsep.utils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.539 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:37:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Dec  1 04:37:22 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3560338767' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  1 04:37:22 np0005540741 podman[256764]: 2025-12-01 09:37:22.976156283 +0000 UTC m=+0.072650241 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec  1 04:37:22 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.997 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:37:23 np0005540741 nova_compute[250706]: 2025-12-01 09:37:22.999 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "cache_volume_driver" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:37:23 np0005540741 nova_compute[250706]: 2025-12-01 09:37:23.001 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:37:23 np0005540741 nova_compute[250706]: 2025-12-01 09:37:23.004 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "cache_volume_driver" "released" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:37:23 np0005540741 systemd[1]: Starting libvirt secret daemon...
Dec  1 04:37:23 np0005540741 systemd[1]: Started libvirt secret daemon.
Dec  1 04:37:23 np0005540741 nova_compute[250706]: 2025-12-01 09:37:23.107 250710 DEBUG nova.objects.instance [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lazy-loading 'pci_devices' on Instance uuid 6740b382-574d-4ced-a156-11a531b94114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 04:37:23 np0005540741 nova_compute[250706]: 2025-12-01 09:37:23.131 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] End _get_guest_xml xml=<domain type="kvm">
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  <uuid>6740b382-574d-4ced-a156-11a531b94114</uuid>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  <name>instance-00000001</name>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  <memory>131072</memory>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  <vcpu>1</vcpu>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  <metadata>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <nova:name>instance-depend-image</nova:name>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <nova:creationTime>2025-12-01 09:37:22</nova:creationTime>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <nova:flavor name="m1.nano">
Dec  1 04:37:23 np0005540741 nova_compute[250706]:        <nova:memory>128</nova:memory>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:        <nova:disk>1</nova:disk>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:        <nova:swap>0</nova:swap>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:        <nova:ephemeral>0</nova:ephemeral>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:        <nova:vcpus>1</nova:vcpus>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      </nova:flavor>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <nova:owner>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:        <nova:user uuid="14165de8e6af473c94a109257a29c50c">tempest-ImageDependencyTests-805054756-project-member</nova:user>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:        <nova:project uuid="8a9d236048d24c39893cd69ad598bc1a">tempest-ImageDependencyTests-805054756</nova:project>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      </nova:owner>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <nova:ports/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    </nova:instance>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  </metadata>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  <sysinfo type="smbios">
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <system>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <entry name="manufacturer">RDO</entry>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <entry name="product">OpenStack Compute</entry>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <entry name="serial">6740b382-574d-4ced-a156-11a531b94114</entry>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <entry name="uuid">6740b382-574d-4ced-a156-11a531b94114</entry>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <entry name="family">Virtual Machine</entry>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    </system>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  </sysinfo>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  <os>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <boot dev="hd"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <smbios mode="sysinfo"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  </os>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  <features>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <acpi/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <apic/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <vmcoreinfo/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  </features>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  <clock offset="utc">
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <timer name="pit" tickpolicy="delay"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <timer name="hpet" present="no"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  </clock>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  <cpu mode="host-model" match="exact">
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <topology sockets="1" cores="1" threads="1"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  </cpu>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  <devices>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <disk type="network" device="cdrom">
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <driver type="raw" cache="none"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <source protocol="rbd" name="vms/6740b382-574d-4ced-a156-11a531b94114_disk.config">
Dec  1 04:37:23 np0005540741 nova_compute[250706]:        <host name="192.168.122.100" port="6789"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      </source>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <auth username="openstack">
Dec  1 04:37:23 np0005540741 nova_compute[250706]:        <secret type="ceph" uuid="5620a9fb-e540-5250-a0e8-7aaad5347e3b"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      </auth>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <target dev="sda" bus="sata"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    </disk>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <disk type="network" device="disk">
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <source protocol="rbd" name="volumes/volume-61c9bc29-5cbf-4816-a0ae-b24ddf88776c">
Dec  1 04:37:23 np0005540741 nova_compute[250706]:        <host name="192.168.122.100" port="6789"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      </source>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <auth username="openstack">
Dec  1 04:37:23 np0005540741 nova_compute[250706]:        <secret type="ceph" uuid="5620a9fb-e540-5250-a0e8-7aaad5347e3b"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      </auth>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <target dev="vda" bus="virtio"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <serial>61c9bc29-5cbf-4816-a0ae-b24ddf88776c</serial>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    </disk>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <serial type="pty">
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <log file="/var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114/console.log" append="off"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    </serial>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <video>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <model type="virtio"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    </video>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <input type="tablet" bus="usb"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <rng model="virtio">
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <backend model="random">/dev/urandom</backend>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    </rng>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <controller type="usb" index="0"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    <memballoon model="virtio">
Dec  1 04:37:23 np0005540741 nova_compute[250706]:      <stats period="10"/>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:    </memballoon>
Dec  1 04:37:23 np0005540741 nova_compute[250706]:  </devices>
Dec  1 04:37:23 np0005540741 nova_compute[250706]: </domain>
Dec  1 04:37:23 np0005540741 nova_compute[250706]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  1 04:37:23 np0005540741 nova_compute[250706]: 2025-12-01 09:37:23.188 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 04:37:23 np0005540741 nova_compute[250706]: 2025-12-01 09:37:23.188 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 04:37:23 np0005540741 nova_compute[250706]: 2025-12-01 09:37:23.189 250710 INFO nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Using config drive#033[00m
Dec  1 04:37:23 np0005540741 nova_compute[250706]: 2025-12-01 09:37:23.217 250710 DEBUG nova.storage.rbd_utils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image 6740b382-574d-4ced-a156-11a531b94114_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 04:37:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v787: 193 pgs: 2 active+clean+snaptrim, 8 active+clean+snaptrim_wait, 183 active+clean; 41 MiB data, 148 MiB used, 60 GiB / 60 GiB avail; 166 KiB/s rd, 15 KiB/s wr, 230 op/s
Dec  1 04:37:23 np0005540741 nova_compute[250706]: 2025-12-01 09:37:23.837 250710 INFO nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Creating config drive at /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114/disk.config#033[00m
Dec  1 04:37:23 np0005540741 nova_compute[250706]: 2025-12-01 09:37:23.842 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2l2wouwm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:37:23 np0005540741 nova_compute[250706]: 2025-12-01 09:37:23.973 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2l2wouwm" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:37:23 np0005540741 nova_compute[250706]: 2025-12-01 09:37:23.998 250710 DEBUG nova.storage.rbd_utils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image 6740b382-574d-4ced-a156-11a531b94114_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 04:37:24 np0005540741 nova_compute[250706]: 2025-12-01 09:37:24.002 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114/disk.config 6740b382-574d-4ced-a156-11a531b94114_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:37:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e71 do_prune osdmap full prune enabled
Dec  1 04:37:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e72 e72: 3 total, 3 up, 3 in
Dec  1 04:37:24 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e72: 3 total, 3 up, 3 in
Dec  1 04:37:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e72 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:37:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e72 do_prune osdmap full prune enabled
Dec  1 04:37:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e73 e73: 3 total, 3 up, 3 in
Dec  1 04:37:25 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e73: 3 total, 3 up, 3 in
Dec  1 04:37:25 np0005540741 nova_compute[250706]: 2025-12-01 09:37:25.335 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114/disk.config 6740b382-574d-4ced-a156-11a531b94114_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.333s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:37:25 np0005540741 nova_compute[250706]: 2025-12-01 09:37:25.336 250710 INFO nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Deleting local config drive /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114/disk.config because it was imported into RBD.#033[00m
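The config-drive sequence above runs in three steps: mkisofs writes a local ISO, `rbd import` uploads it into the `vms` pool, and the local file is deleted. As a reading aid, the `rbd import` invocation from the log can be reassembled like this (an illustrative sketch, not Nova's actual code; the pool, user, and conf path are taken from the log, the helper name is invented):

```python
# Illustrative helper (not part of Nova): rebuild the `rbd import` command line
# that nova_compute logged when uploading the config drive into RBD.
def rbd_import_cmd(local_path: str, image_name: str,
                   pool: str = "vms", user: str = "openstack",
                   conf: str = "/etc/ceph/ceph.conf") -> list[str]:
    return ["rbd", "import", "--pool", pool, local_path, image_name,
            "--image-format=2", "--id", user, "--conf", conf]
```

The 1.3 s runtime logged for this command is dominated by the RBD write; the mkisofs step before it took only 0.131 s.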
Dec  1 04:37:25 np0005540741 systemd-machined[212908]: New machine qemu-1-instance-00000001.
Dec  1 04:37:25 np0005540741 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Dec  1 04:37:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v790: 193 pgs: 193 active+clean; 41 MiB data, 148 MiB used, 60 GiB / 60 GiB avail; 176 KiB/s rd, 16 KiB/s wr, 244 op/s
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.210 250710 DEBUG nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.213 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.214 250710 DEBUG nova.virt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Emitting event <LifecycleEvent: 1764581846.211662, 6740b382-574d-4ced-a156-11a531b94114 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.214 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] VM Resumed (Lifecycle Event)#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.223 250710 INFO nova.virt.libvirt.driver [-] [instance: 6740b382-574d-4ced-a156-11a531b94114] Instance spawned successfully.#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.224 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.296 250710 DEBUG nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.300 250710 DEBUG nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.314 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.314 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.315 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.315 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.316 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.316 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.322 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.322 250710 DEBUG nova.virt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Emitting event <LifecycleEvent: 1764581846.2117918, 6740b382-574d-4ced-a156-11a531b94114 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.322 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] VM Started (Lifecycle Event)#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.379 250710 DEBUG nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.383 250710 DEBUG nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
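The "Synchronizing instance power state" lines above report power states as integers ("DB power_state: 0, VM power_state: 1"). For reading the log, the mapping below reflects Nova's `nova.compute.power_state` constants (listed here from general Nova knowledge, not imported from this host): 0 is NOSTATE (the DB has not yet recorded a state while the instance is still spawning) and 1 is RUNNING (the VM is already up on the hypervisor).

```python
# Nova power-state integers as used in the "Synchronizing instance power state"
# log lines; values per nova.compute.power_state (reference table, not Nova code).
POWER_STATE = {
    0: "NOSTATE",
    1: "RUNNING",
    3: "PAUSED",
    4: "SHUTDOWN",
    6: "CRASHED",
    7: "SUSPENDED",
}
```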
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.388 250710 INFO nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Took 3.90 seconds to spawn the instance on the hypervisor.#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.389 250710 DEBUG nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.405 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.469 250710 INFO nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Took 7.82 seconds to build instance.#033[00m
Dec  1 04:37:26 np0005540741 nova_compute[250706]: 2025-12-01 09:37:26.491 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "6740b382-574d-4ced-a156-11a531b94114" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.931s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:37:27 np0005540741 nova_compute[250706]: 2025-12-01 09:37:27.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:37:27 np0005540741 nova_compute[250706]: 2025-12-01 09:37:27.053 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  1 04:37:27 np0005540741 nova_compute[250706]: 2025-12-01 09:37:27.073 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  1 04:37:27 np0005540741 nova_compute[250706]: 2025-12-01 09:37:27.074 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:37:27 np0005540741 nova_compute[250706]: 2025-12-01 09:37:27.075 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  1 04:37:27 np0005540741 nova_compute[250706]: 2025-12-01 09:37:27.093 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:37:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v791: 193 pgs: 193 active+clean; 41 MiB data, 148 MiB used, 60 GiB / 60 GiB avail; 125 KiB/s rd, 28 KiB/s wr, 170 op/s
Dec  1 04:37:29 np0005540741 nova_compute[250706]: 2025-12-01 09:37:29.118 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:37:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e73 do_prune osdmap full prune enabled
Dec  1 04:37:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e74 e74: 3 total, 3 up, 3 in
Dec  1 04:37:29 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e74: 3 total, 3 up, 3 in
Dec  1 04:37:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v793: 193 pgs: 193 active+clean; 41 MiB data, 148 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 25 KiB/s wr, 31 op/s
Dec  1 04:37:30 np0005540741 nova_compute[250706]: 2025-12-01 09:37:30.048 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:37:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e74 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:37:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e74 do_prune osdmap full prune enabled
Dec  1 04:37:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e75 e75: 3 total, 3 up, 3 in
Dec  1 04:37:30 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e75: 3 total, 3 up, 3 in
Dec  1 04:37:30 np0005540741 podman[256920]: 2025-12-01 09:37:30.398441771 +0000 UTC m=+0.150803401 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  1 04:37:31 np0005540741 nova_compute[250706]: 2025-12-01 09:37:31.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:37:31 np0005540741 nova_compute[250706]: 2025-12-01 09:37:31.053 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 04:37:31 np0005540741 nova_compute[250706]: 2025-12-01 09:37:31.054 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 04:37:31 np0005540741 nova_compute[250706]: 2025-12-01 09:37:31.571 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "refresh_cache-6740b382-574d-4ced-a156-11a531b94114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 04:37:31 np0005540741 nova_compute[250706]: 2025-12-01 09:37:31.572 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquired lock "refresh_cache-6740b382-574d-4ced-a156-11a531b94114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 04:37:31 np0005540741 nova_compute[250706]: 2025-12-01 09:37:31.572 250710 DEBUG nova.network.neutron [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  1 04:37:31 np0005540741 nova_compute[250706]: 2025-12-01 09:37:31.573 250710 DEBUG nova.objects.instance [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6740b382-574d-4ced-a156-11a531b94114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 04:37:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v795: 193 pgs: 193 active+clean; 41 MiB data, 166 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 25 KiB/s wr, 48 op/s
Dec  1 04:37:31 np0005540741 nova_compute[250706]: 2025-12-01 09:37:31.903 250710 DEBUG nova.network.neutron [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.200 250710 DEBUG nova.network.neutron [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.221 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Releasing lock "refresh_cache-6740b382-574d-4ced-a156-11a531b94114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.222 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.223 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.223 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.223 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.224 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.224 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.224 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.252 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.253 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.253 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.254 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.254 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:37:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec  1 04:37:32 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1522047460' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.730 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:37:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e75 do_prune osdmap full prune enabled
Dec  1 04:37:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e76 e76: 3 total, 3 up, 3 in
Dec  1 04:37:32 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e76: 3 total, 3 up, 3 in
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.836 250710 DEBUG nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.836 250710 DEBUG nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.985 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.986 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5209MB free_disk=59.98813247680664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.986 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:37:32 np0005540741 nova_compute[250706]: 2025-12-01 09:37:32.986 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:37:33 np0005540741 nova_compute[250706]: 2025-12-01 09:37:33.250 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Instance 6740b382-574d-4ced-a156-11a531b94114 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  1 04:37:33 np0005540741 nova_compute[250706]: 2025-12-01 09:37:33.251 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 04:37:33 np0005540741 nova_compute[250706]: 2025-12-01 09:37:33.251 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 04:37:33 np0005540741 nova_compute[250706]: 2025-12-01 09:37:33.366 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Refreshing inventories for resource provider 847e3dbe-0f76-4032-a374-8c965945c22f _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  1 04:37:33 np0005540741 nova_compute[250706]: 2025-12-01 09:37:33.459 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Updating ProviderTree inventory for provider 847e3dbe-0f76-4032-a374-8c965945c22f from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  1 04:37:33 np0005540741 nova_compute[250706]: 2025-12-01 09:37:33.460 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Updating inventory in ProviderTree for provider 847e3dbe-0f76-4032-a374-8c965945c22f with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 04:37:33 np0005540741 nova_compute[250706]: 2025-12-01 09:37:33.479 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Refreshing aggregate associations for resource provider 847e3dbe-0f76-4032-a374-8c965945c22f, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  1 04:37:33 np0005540741 nova_compute[250706]: 2025-12-01 09:37:33.504 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Refreshing trait associations for resource provider 847e3dbe-0f76-4032-a374-8c965945c22f, traits: COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE4A,HW_CPU_X86_AESNI,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_F16C,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_MMX,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_BMI2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  1 04:37:33 np0005540741 nova_compute[250706]: 2025-12-01 09:37:33.550 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:37:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v797: 193 pgs: 193 active+clean; 41 MiB data, 166 MiB used, 60 GiB / 60 GiB avail; 70 KiB/s rd, 3.3 KiB/s wr, 90 op/s
Dec  1 04:37:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e76 do_prune osdmap full prune enabled
Dec  1 04:37:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e77 e77: 3 total, 3 up, 3 in
Dec  1 04:37:33 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e77: 3 total, 3 up, 3 in
Dec  1 04:37:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec  1 04:37:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/812207976' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 04:37:34 np0005540741 nova_compute[250706]: 2025-12-01 09:37:34.036 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:37:34 np0005540741 nova_compute[250706]: 2025-12-01 09:37:34.042 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 04:37:34 np0005540741 nova_compute[250706]: 2025-12-01 09:37:34.067 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 04:37:34 np0005540741 nova_compute[250706]: 2025-12-01 09:37:34.092 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 04:37:34 np0005540741 nova_compute[250706]: 2025-12-01 09:37:34.093 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:37:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e77 do_prune osdmap full prune enabled
Dec  1 04:37:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e78 e78: 3 total, 3 up, 3 in
Dec  1 04:37:34 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e78: 3 total, 3 up, 3 in
Dec  1 04:37:34 np0005540741 nova_compute[250706]: 2025-12-01 09:37:34.922 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:37:35 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e78 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:37:35 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e78 do_prune osdmap full prune enabled
Dec  1 04:37:35 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e79 e79: 3 total, 3 up, 3 in
Dec  1 04:37:35 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e79: 3 total, 3 up, 3 in
Dec  1 04:37:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v801: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 134 KiB/s rd, 8.5 KiB/s wr, 180 op/s
Dec  1 04:37:35 np0005540741 podman[256992]: 2025-12-01 09:37:35.979139107 +0000 UTC m=+0.076701768 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  1 04:37:36 np0005540741 nova_compute[250706]: 2025-12-01 09:37:36.949 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:37:36 np0005540741 nova_compute[250706]: 2025-12-01 09:37:36.950 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:37:36 np0005540741 nova_compute[250706]: 2025-12-01 09:37:36.964 250710 DEBUG nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.030 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.031 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.039 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.039 250710 INFO nova.compute.claims [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.169 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:37:37 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec  1 04:37:37 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1234859688' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.629 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.636 250710 DEBUG nova.compute.provider_tree [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.659 250710 DEBUG nova.scheduler.client.report [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.687 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.656s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:37:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v802: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 74 KiB/s rd, 6.5 KiB/s wr, 102 op/s
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.688 250710 DEBUG nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.751 250710 DEBUG nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.751 250710 DEBUG nova.network.neutron [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.782 250710 INFO nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.803 250710 DEBUG nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.908 250710 DEBUG nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.910 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.911 250710 INFO nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Creating image(s)#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.945 250710 DEBUG nova.storage.rbd_utils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.971 250710 DEBUG nova.storage.rbd_utils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 04:37:37 np0005540741 nova_compute[250706]: 2025-12-01 09:37:37.998 250710 DEBUG nova.storage.rbd_utils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.001 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "7d2050fd4f341e6a47ec44656714d34127018d9a" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.002 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "7d2050fd4f341e6a47ec44656714d34127018d9a" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.353 250710 DEBUG nova.network.neutron [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.353 250710 DEBUG nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.358 250710 DEBUG nova.virt.libvirt.imagebackend [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Image locations are: [{'url': 'rbd://5620a9fb-e540-5250-a0e8-7aaad5347e3b/images/44751503-6174-45f0-a7ed-07cbb763b067/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://5620a9fb-e540-5250-a0e8-7aaad5347e3b/images/44751503-6174-45f0-a7ed-07cbb763b067/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.428 250710 DEBUG nova.virt.libvirt.imagebackend [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Selected location: {'url': 'rbd://5620a9fb-e540-5250-a0e8-7aaad5347e3b/images/44751503-6174-45f0-a7ed-07cbb763b067/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.428 250710 DEBUG nova.storage.rbd_utils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] cloning images/44751503-6174-45f0-a7ed-07cbb763b067@snap to None/beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.608 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "7d2050fd4f341e6a47ec44656714d34127018d9a" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.792 250710 DEBUG nova.storage.rbd_utils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] resizing rbd image beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.897 250710 DEBUG nova.objects.instance [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lazy-loading 'migration_context' on Instance uuid beb3fd59-b728-4e62-bc14-b171eebe8ee3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.918 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.918 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Ensure instance console log exists: /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.919 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.919 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.919 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.921 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='75f3695eaf1b320ff6b5ece01c175a54',container_format='bare',created_at=2025-12-01T09:37:34Z,direct_url=<?>,disk_format='raw',id=44751503-6174-45f0-a7ed-07cbb763b067,min_disk=0,min_ram=0,name='tempest-image-dependency-test-1690092867',owner='8a9d236048d24c39893cd69ad598bc1a',properties=ImageMetaProps,protected=<?>,size=1024,status='active',tags=<?>,updated_at=2025-12-01T09:37:34Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'device_name': '/dev/vda', 'image_id': '44751503-6174-45f0-a7ed-07cbb763b067'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.927 250710 WARNING nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.936 250710 DEBUG nova.virt.libvirt.host [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.937 250710 DEBUG nova.virt.libvirt.host [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.940 250710 DEBUG nova.virt.libvirt.host [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.940 250710 DEBUG nova.virt.libvirt.host [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.941 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.941 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-01T09:36:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dff9230f-1656-4ee2-9f6d-710f2458058e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='75f3695eaf1b320ff6b5ece01c175a54',container_format='bare',created_at=2025-12-01T09:37:34Z,direct_url=<?>,disk_format='raw',id=44751503-6174-45f0-a7ed-07cbb763b067,min_disk=0,min_ram=0,name='tempest-image-dependency-test-1690092867',owner='8a9d236048d24c39893cd69ad598bc1a',properties=ImageMetaProps,protected=<?>,size=1024,status='active',tags=<?>,updated_at=2025-12-01T09:37:34Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.941 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.942 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.942 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.942 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.942 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.942 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.943 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.943 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.943 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.943 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  1 04:37:38 np0005540741 nova_compute[250706]: 2025-12-01 09:37:38.945 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:37:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Dec  1 04:37:39 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3000882322' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  1 04:37:39 np0005540741 nova_compute[250706]: 2025-12-01 09:37:39.397 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:37:39 np0005540741 nova_compute[250706]: 2025-12-01 09:37:39.429 250710 DEBUG nova.storage.rbd_utils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 04:37:39 np0005540741 nova_compute[250706]: 2025-12-01 09:37:39.434 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:37:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v803: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 61 KiB/s rd, 5.3 KiB/s wr, 84 op/s
Dec  1 04:37:39 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Dec  1 04:37:39 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/166926032' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  1 04:37:39 np0005540741 nova_compute[250706]: 2025-12-01 09:37:39.849 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:37:39 np0005540741 nova_compute[250706]: 2025-12-01 09:37:39.851 250710 DEBUG nova.objects.instance [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lazy-loading 'pci_devices' on Instance uuid beb3fd59-b728-4e62-bc14-b171eebe8ee3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 04:37:39 np0005540741 nova_compute[250706]: 2025-12-01 09:37:39.880 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] End _get_guest_xml xml=<domain type="kvm">
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  <uuid>beb3fd59-b728-4e62-bc14-b171eebe8ee3</uuid>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  <name>instance-00000002</name>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  <memory>131072</memory>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  <vcpu>1</vcpu>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  <metadata>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <nova:name>instance-depend-image</nova:name>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <nova:creationTime>2025-12-01 09:37:38</nova:creationTime>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <nova:flavor name="m1.nano">
Dec  1 04:37:39 np0005540741 nova_compute[250706]:        <nova:memory>128</nova:memory>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:        <nova:disk>1</nova:disk>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:        <nova:swap>0</nova:swap>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:        <nova:ephemeral>0</nova:ephemeral>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:        <nova:vcpus>1</nova:vcpus>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      </nova:flavor>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <nova:owner>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:        <nova:user uuid="14165de8e6af473c94a109257a29c50c">tempest-ImageDependencyTests-805054756-project-member</nova:user>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:        <nova:project uuid="8a9d236048d24c39893cd69ad598bc1a">tempest-ImageDependencyTests-805054756</nova:project>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      </nova:owner>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <nova:root type="image" uuid="44751503-6174-45f0-a7ed-07cbb763b067"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <nova:ports/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    </nova:instance>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  </metadata>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  <sysinfo type="smbios">
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <system>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <entry name="manufacturer">RDO</entry>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <entry name="product">OpenStack Compute</entry>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <entry name="serial">beb3fd59-b728-4e62-bc14-b171eebe8ee3</entry>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <entry name="uuid">beb3fd59-b728-4e62-bc14-b171eebe8ee3</entry>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <entry name="family">Virtual Machine</entry>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    </system>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  </sysinfo>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  <os>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <boot dev="hd"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <smbios mode="sysinfo"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  </os>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  <features>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <acpi/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <apic/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <vmcoreinfo/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  </features>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  <clock offset="utc">
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <timer name="pit" tickpolicy="delay"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <timer name="hpet" present="no"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  </clock>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  <cpu mode="host-model" match="exact">
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <topology sockets="1" cores="1" threads="1"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  </cpu>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  <devices>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <disk type="network" device="disk">
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <driver type="raw" cache="none"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <source protocol="rbd" name="vms/beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk">
Dec  1 04:37:39 np0005540741 nova_compute[250706]:        <host name="192.168.122.100" port="6789"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      </source>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <auth username="openstack">
Dec  1 04:37:39 np0005540741 nova_compute[250706]:        <secret type="ceph" uuid="5620a9fb-e540-5250-a0e8-7aaad5347e3b"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      </auth>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <target dev="vda" bus="virtio"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    </disk>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <disk type="network" device="cdrom">
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <driver type="raw" cache="none"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <source protocol="rbd" name="vms/beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk.config">
Dec  1 04:37:39 np0005540741 nova_compute[250706]:        <host name="192.168.122.100" port="6789"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      </source>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <auth username="openstack">
Dec  1 04:37:39 np0005540741 nova_compute[250706]:        <secret type="ceph" uuid="5620a9fb-e540-5250-a0e8-7aaad5347e3b"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      </auth>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <target dev="sda" bus="sata"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    </disk>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <serial type="pty">
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <log file="/var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3/console.log" append="off"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    </serial>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <video>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <model type="virtio"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    </video>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <input type="tablet" bus="usb"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <rng model="virtio">
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <backend model="random">/dev/urandom</backend>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    </rng>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <controller type="usb" index="0"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    <memballoon model="virtio">
Dec  1 04:37:39 np0005540741 nova_compute[250706]:      <stats period="10"/>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:    </memballoon>
Dec  1 04:37:39 np0005540741 nova_compute[250706]:  </devices>
Dec  1 04:37:39 np0005540741 nova_compute[250706]: </domain>
Dec  1 04:37:39 np0005540741 nova_compute[250706]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  1 04:37:39 np0005540741 nova_compute[250706]: 2025-12-01 09:37:39.921 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 04:37:39 np0005540741 nova_compute[250706]: 2025-12-01 09:37:39.922 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 04:37:39 np0005540741 nova_compute[250706]: 2025-12-01 09:37:39.923 250710 INFO nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Using config drive#033[00m
Dec  1 04:37:39 np0005540741 nova_compute[250706]: 2025-12-01 09:37:39.950 250710 DEBUG nova.storage.rbd_utils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 04:37:40 np0005540741 nova_compute[250706]: 2025-12-01 09:37:40.175 250710 INFO nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Creating config drive at /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3/disk.config#033[00m
Dec  1 04:37:40 np0005540741 nova_compute[250706]: 2025-12-01 09:37:40.180 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj0ymohq_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:37:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e79 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:37:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e79 do_prune osdmap full prune enabled
Dec  1 04:37:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e80 e80: 3 total, 3 up, 3 in
Dec  1 04:37:40 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e80: 3 total, 3 up, 3 in
Dec  1 04:37:40 np0005540741 nova_compute[250706]: 2025-12-01 09:37:40.318 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj0ymohq_" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:37:40 np0005540741 nova_compute[250706]: 2025-12-01 09:37:40.366 250710 DEBUG nova.storage.rbd_utils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 04:37:40 np0005540741 nova_compute[250706]: 2025-12-01 09:37:40.371 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3/disk.config beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:37:40 np0005540741 nova_compute[250706]: 2025-12-01 09:37:40.545 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3/disk.config beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.174s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:37:40 np0005540741 nova_compute[250706]: 2025-12-01 09:37:40.547 250710 INFO nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Deleting local config drive /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3/disk.config because it was imported into RBD.#033[00m
Dec  1 04:37:40 np0005540741 systemd-machined[212908]: New machine qemu-2-instance-00000002.
Dec  1 04:37:40 np0005540741 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.398 250710 DEBUG nova.virt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Emitting event <LifecycleEvent: 1764581861.3982816, beb3fd59-b728-4e62-bc14-b171eebe8ee3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.399 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] VM Resumed (Lifecycle Event)#033[00m
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.402 250710 DEBUG nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.402 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.406 250710 INFO nova.virt.libvirt.driver [-] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Instance spawned successfully.#033[00m
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.406 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.428 250710 DEBUG nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.434 250710 DEBUG nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.436 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.437 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.437 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.437 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.438 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.438 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.474 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.474 250710 DEBUG nova.virt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Emitting event <LifecycleEvent: 1764581861.3993945, beb3fd59-b728-4e62-bc14-b171eebe8ee3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.475 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] VM Started (Lifecycle Event)
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.501 250710 DEBUG nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.505 250710 DEBUG nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.510 250710 INFO nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Took 3.60 seconds to spawn the instance on the hypervisor.
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.510 250710 DEBUG nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.531 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.566 250710 INFO nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Took 4.56 seconds to build instance.
Dec  1 04:37:41 np0005540741 nova_compute[250706]: 2025-12-01 09:37:41.582 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  1 04:37:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v805: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 82 KiB/s rd, 5.5 KiB/s wr, 109 op/s
Dec  1 04:37:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:37:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:37:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:37:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:37:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:37:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:37:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v806: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 82 KiB/s rd, 22 KiB/s wr, 107 op/s
Dec  1 04:37:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec  1 04:37:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1566923500' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 04:37:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec  1 04:37:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1566923500' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 04:37:44 np0005540741 nova_compute[250706]: 2025-12-01 09:37:44.921 250710 DEBUG nova.compute.manager [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  1 04:37:44 np0005540741 nova_compute[250706]: 2025-12-01 09:37:44.961 250710 INFO nova.compute.manager [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] instance snapshotting
Dec  1 04:37:45 np0005540741 nova_compute[250706]: 2025-12-01 09:37:45.152 250710 INFO nova.virt.libvirt.driver [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Beginning live snapshot process
Dec  1 04:37:45 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:37:45 np0005540741 nova_compute[250706]: 2025-12-01 09:37:45.355 250710 DEBUG nova.storage.rbd_utils [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] creating snapshot(0e245d27dde741ffa3dc5af7777eec0d) on rbd image(beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Dec  1 04:37:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v807: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 59 KiB/s rd, 17 KiB/s wr, 75 op/s
Dec  1 04:37:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e80 do_prune osdmap full prune enabled
Dec  1 04:37:46 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e81 e81: 3 total, 3 up, 3 in
Dec  1 04:37:46 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e81: 3 total, 3 up, 3 in
Dec  1 04:37:46 np0005540741 nova_compute[250706]: 2025-12-01 09:37:46.370 250710 DEBUG nova.storage.rbd_utils [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] cloning vms/beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk@0e245d27dde741ffa3dc5af7777eec0d to images/e7cfd47e-36b4-4753-ba43-b81de92dca95 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Dec  1 04:37:46 np0005540741 nova_compute[250706]: 2025-12-01 09:37:46.498 250710 DEBUG nova.storage.rbd_utils [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] flattening images/e7cfd47e-36b4-4753-ba43-b81de92dca95 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Dec  1 04:37:46 np0005540741 nova_compute[250706]: 2025-12-01 09:37:46.664 250710 DEBUG nova.storage.rbd_utils [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] removing snapshot(0e245d27dde741ffa3dc5af7777eec0d) on rbd image(beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Dec  1 04:37:47 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e81 do_prune osdmap full prune enabled
Dec  1 04:37:47 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e82 e82: 3 total, 3 up, 3 in
Dec  1 04:37:47 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e82: 3 total, 3 up, 3 in
Dec  1 04:37:47 np0005540741 nova_compute[250706]: 2025-12-01 09:37:47.357 250710 DEBUG nova.storage.rbd_utils [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] creating snapshot(snap) on rbd image(e7cfd47e-36b4-4753-ba43-b81de92dca95) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Dec  1 04:37:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v810: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 99 KiB/s rd, 24 KiB/s wr, 123 op/s
Dec  1 04:37:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e82 do_prune osdmap full prune enabled
Dec  1 04:37:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e83 e83: 3 total, 3 up, 3 in
Dec  1 04:37:48 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e83: 3 total, 3 up, 3 in
Dec  1 04:37:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v812: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 65 KiB/s rd, 3.8 KiB/s wr, 83 op/s
Dec  1 04:37:49 np0005540741 nova_compute[250706]: 2025-12-01 09:37:49.801 250710 INFO nova.virt.libvirt.driver [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Snapshot image upload complete
Dec  1 04:37:49 np0005540741 nova_compute[250706]: 2025-12-01 09:37:49.801 250710 INFO nova.compute.manager [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Took 4.84 seconds to snapshot the instance on the hypervisor.
Dec  1 04:37:50 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:37:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v813: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 82 KiB/s rd, 4.5 KiB/s wr, 106 op/s
Dec  1 04:37:52 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e83 do_prune osdmap full prune enabled
Dec  1 04:37:52 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e84 e84: 3 total, 3 up, 3 in
Dec  1 04:37:52 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e84: 3 total, 3 up, 3 in
Dec  1 04:37:53 np0005540741 nova_compute[250706]: 2025-12-01 09:37:53.133 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  1 04:37:53 np0005540741 nova_compute[250706]: 2025-12-01 09:37:53.133 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  1 04:37:53 np0005540741 nova_compute[250706]: 2025-12-01 09:37:53.133 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  1 04:37:53 np0005540741 nova_compute[250706]: 2025-12-01 09:37:53.134 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  1 04:37:53 np0005540741 nova_compute[250706]: 2025-12-01 09:37:53.134 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  1 04:37:53 np0005540741 nova_compute[250706]: 2025-12-01 09:37:53.135 250710 INFO nova.compute.manager [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Terminating instance
Dec  1 04:37:53 np0005540741 nova_compute[250706]: 2025-12-01 09:37:53.136 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "refresh_cache-beb3fd59-b728-4e62-bc14-b171eebe8ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  1 04:37:53 np0005540741 nova_compute[250706]: 2025-12-01 09:37:53.137 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquired lock "refresh_cache-beb3fd59-b728-4e62-bc14-b171eebe8ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  1 04:37:53 np0005540741 nova_compute[250706]: 2025-12-01 09:37:53.137 250710 DEBUG nova.network.neutron [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec  1 04:37:53 np0005540741 nova_compute[250706]: 2025-12-01 09:37:53.308 250710 DEBUG nova.network.neutron [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec  1 04:37:53 np0005540741 nova_compute[250706]: 2025-12-01 09:37:53.641 250710 DEBUG nova.network.neutron [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  1 04:37:53 np0005540741 nova_compute[250706]: 2025-12-01 09:37:53.656 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Releasing lock "refresh_cache-beb3fd59-b728-4e62-bc14-b171eebe8ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  1 04:37:53 np0005540741 nova_compute[250706]: 2025-12-01 09:37:53.657 250710 DEBUG nova.compute.manager [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec  1 04:37:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v815: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 115 KiB/s rd, 5.6 KiB/s wr, 147 op/s
Dec  1 04:37:53 np0005540741 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Dec  1 04:37:53 np0005540741 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 1.318s CPU time.
Dec  1 04:37:53 np0005540741 systemd-machined[212908]: Machine qemu-2-instance-00000002 terminated.
Dec  1 04:37:53 np0005540741 podman[257546]: 2025-12-01 09:37:53.797193986 +0000 UTC m=+0.066427482 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  1 04:37:53 np0005540741 nova_compute[250706]: 2025-12-01 09:37:53.883 250710 INFO nova.virt.libvirt.driver [-] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Instance destroyed successfully.
Dec  1 04:37:53 np0005540741 nova_compute[250706]: 2025-12-01 09:37:53.884 250710 DEBUG nova.objects.instance [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lazy-loading 'resources' on Instance uuid beb3fd59-b728-4e62-bc14-b171eebe8ee3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  1 04:37:55 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:37:55 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e84 do_prune osdmap full prune enabled
Dec  1 04:37:55 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e85 e85: 3 total, 3 up, 3 in
Dec  1 04:37:55 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e85: 3 total, 3 up, 3 in
Dec  1 04:37:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v817: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 88 KiB/s rd, 4.1 KiB/s wr, 114 op/s
Dec  1 04:37:56 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e85 do_prune osdmap full prune enabled
Dec  1 04:37:56 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e86 e86: 3 total, 3 up, 3 in
Dec  1 04:37:56 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e86: 3 total, 3 up, 3 in
Dec  1 04:37:56 np0005540741 nova_compute[250706]: 2025-12-01 09:37:56.584 250710 INFO nova.virt.libvirt.driver [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Deleting instance files /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3_del
Dec  1 04:37:56 np0005540741 nova_compute[250706]: 2025-12-01 09:37:56.584 250710 INFO nova.virt.libvirt.driver [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Deletion of /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3_del complete
Dec  1 04:37:56 np0005540741 nova_compute[250706]: 2025-12-01 09:37:56.742 250710 DEBUG nova.virt.libvirt.host [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Dec  1 04:37:56 np0005540741 nova_compute[250706]: 2025-12-01 09:37:56.743 250710 INFO nova.virt.libvirt.host [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] UEFI support detected
Dec  1 04:37:56 np0005540741 nova_compute[250706]: 2025-12-01 09:37:56.745 250710 INFO nova.compute.manager [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Took 3.09 seconds to destroy the instance on the hypervisor.
Dec  1 04:37:56 np0005540741 nova_compute[250706]: 2025-12-01 09:37:56.746 250710 DEBUG oslo.service.loopingcall [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec  1 04:37:56 np0005540741 nova_compute[250706]: 2025-12-01 09:37:56.746 250710 DEBUG nova.compute.manager [-] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec  1 04:37:56 np0005540741 nova_compute[250706]: 2025-12-01 09:37:56.746 250710 DEBUG nova.network.neutron [-] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec  1 04:37:57 np0005540741 nova_compute[250706]: 2025-12-01 09:37:57.017 250710 DEBUG nova.network.neutron [-] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec  1 04:37:57 np0005540741 nova_compute[250706]: 2025-12-01 09:37:57.032 250710 DEBUG nova.network.neutron [-] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  1 04:37:57 np0005540741 nova_compute[250706]: 2025-12-01 09:37:57.044 250710 INFO nova.compute.manager [-] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Took 0.30 seconds to deallocate network for instance.
Dec  1 04:37:57 np0005540741 nova_compute[250706]: 2025-12-01 09:37:57.092 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  1 04:37:57 np0005540741 nova_compute[250706]: 2025-12-01 09:37:57.093 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  1 04:37:57 np0005540741 nova_compute[250706]: 2025-12-01 09:37:57.163 250710 DEBUG oslo_concurrency.processutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  1 04:37:57 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec  1 04:37:57 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2334927344' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 04:37:57 np0005540741 nova_compute[250706]: 2025-12-01 09:37:57.648 250710 DEBUG oslo_concurrency.processutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  1 04:37:57 np0005540741 nova_compute[250706]: 2025-12-01 09:37:57.654 250710 DEBUG nova.compute.provider_tree [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  1 04:37:57 np0005540741 nova_compute[250706]: 2025-12-01 09:37:57.682 250710 DEBUG nova.scheduler.client.report [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  1 04:37:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v819: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 100 KiB/s rd, 5.5 KiB/s wr, 133 op/s
Dec  1 04:37:57 np0005540741 nova_compute[250706]: 2025-12-01 09:37:57.705 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.612s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  1 04:37:57 np0005540741 nova_compute[250706]: 2025-12-01 09:37:57.740 250710 INFO nova.scheduler.client.report [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Deleted allocations for instance beb3fd59-b728-4e62-bc14-b171eebe8ee3
Dec  1 04:37:57 np0005540741 nova_compute[250706]: 2025-12-01 09:37:57.822 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  1 04:37:58 np0005540741 nova_compute[250706]: 2025-12-01 09:37:58.376 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "6740b382-574d-4ced-a156-11a531b94114" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  1 04:37:58 np0005540741 nova_compute[250706]: 2025-12-01 09:37:58.377 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "6740b382-574d-4ced-a156-11a531b94114" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  1 04:37:58 np0005540741 nova_compute[250706]: 2025-12-01 09:37:58.377 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "6740b382-574d-4ced-a156-11a531b94114-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  1 04:37:58 np0005540741 nova_compute[250706]: 2025-12-01 09:37:58.377 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "6740b382-574d-4ced-a156-11a531b94114-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  1 04:37:58 np0005540741 nova_compute[250706]: 2025-12-01 09:37:58.378 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "6740b382-574d-4ced-a156-11a531b94114-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  1 04:37:58 np0005540741 nova_compute[250706]: 2025-12-01 09:37:58.380 250710 INFO nova.compute.manager [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Terminating instance
Dec  1 04:37:58 np0005540741 nova_compute[250706]: 2025-12-01 09:37:58.381 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "refresh_cache-6740b382-574d-4ced-a156-11a531b94114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  1 04:37:58 np0005540741 nova_compute[250706]: 2025-12-01 09:37:58.381 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquired lock "refresh_cache-6740b382-574d-4ced-a156-11a531b94114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  1 04:37:58 np0005540741 nova_compute[250706]: 2025-12-01 09:37:58.381 250710 DEBUG nova.network.neutron [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  1 04:37:59 np0005540741 nova_compute[250706]: 2025-12-01 09:37:59.023 250710 DEBUG nova.network.neutron [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  1 04:37:59 np0005540741 nova_compute[250706]: 2025-12-01 09:37:59.246 250710 DEBUG nova.network.neutron [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 04:37:59 np0005540741 nova_compute[250706]: 2025-12-01 09:37:59.264 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Releasing lock "refresh_cache-6740b382-574d-4ced-a156-11a531b94114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 04:37:59 np0005540741 nova_compute[250706]: 2025-12-01 09:37:59.265 250710 DEBUG nova.compute.manager [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  1 04:37:59 np0005540741 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Dec  1 04:37:59 np0005540741 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 1.321s CPU time.
Dec  1 04:37:59 np0005540741 systemd-machined[212908]: Machine qemu-1-instance-00000001 terminated.
Dec  1 04:37:59 np0005540741 nova_compute[250706]: 2025-12-01 09:37:59.489 250710 INFO nova.virt.libvirt.driver [-] [instance: 6740b382-574d-4ced-a156-11a531b94114] Instance destroyed successfully.#033[00m
Dec  1 04:37:59 np0005540741 nova_compute[250706]: 2025-12-01 09:37:59.490 250710 DEBUG nova.objects.instance [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lazy-loading 'resources' on Instance uuid 6740b382-574d-4ced-a156-11a531b94114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 04:37:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v820: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 83 KiB/s rd, 4.6 KiB/s wr, 110 op/s
Dec  1 04:37:59 np0005540741 nova_compute[250706]: 2025-12-01 09:37:59.716 250710 INFO nova.virt.libvirt.driver [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Deleting instance files /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114_del#033[00m
Dec  1 04:37:59 np0005540741 nova_compute[250706]: 2025-12-01 09:37:59.717 250710 INFO nova.virt.libvirt.driver [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Deletion of /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114_del complete#033[00m
Dec  1 04:38:00 np0005540741 nova_compute[250706]: 2025-12-01 09:38:00.006 250710 INFO nova.compute.manager [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Took 0.74 seconds to destroy the instance on the hypervisor.#033[00m
Dec  1 04:38:00 np0005540741 nova_compute[250706]: 2025-12-01 09:38:00.007 250710 DEBUG oslo.service.loopingcall [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  1 04:38:00 np0005540741 nova_compute[250706]: 2025-12-01 09:38:00.007 250710 DEBUG nova.compute.manager [-] [instance: 6740b382-574d-4ced-a156-11a531b94114] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  1 04:38:00 np0005540741 nova_compute[250706]: 2025-12-01 09:38:00.008 250710 DEBUG nova.network.neutron [-] [instance: 6740b382-574d-4ced-a156-11a531b94114] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  1 04:38:00 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:38:00 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e86 do_prune osdmap full prune enabled
Dec  1 04:38:00 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e87 e87: 3 total, 3 up, 3 in
Dec  1 04:38:00 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e87: 3 total, 3 up, 3 in
Dec  1 04:38:01 np0005540741 podman[257634]: 2025-12-01 09:38:01.002677407 +0000 UTC m=+0.100901415 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 04:38:01 np0005540741 nova_compute[250706]: 2025-12-01 09:38:01.015 250710 DEBUG nova.network.neutron [-] [instance: 6740b382-574d-4ced-a156-11a531b94114] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  1 04:38:01 np0005540741 nova_compute[250706]: 2025-12-01 09:38:01.042 250710 DEBUG nova.network.neutron [-] [instance: 6740b382-574d-4ced-a156-11a531b94114] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 04:38:01 np0005540741 nova_compute[250706]: 2025-12-01 09:38:01.058 250710 INFO nova.compute.manager [-] [instance: 6740b382-574d-4ced-a156-11a531b94114] Took 1.05 seconds to deallocate network for instance.#033[00m
Dec  1 04:38:01 np0005540741 nova_compute[250706]: 2025-12-01 09:38:01.347 250710 INFO nova.compute.manager [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Took 0.29 seconds to detach 1 volumes for instance.#033[00m
Dec  1 04:38:01 np0005540741 nova_compute[250706]: 2025-12-01 09:38:01.348 250710 DEBUG nova.compute.manager [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Deleting volume: 61c9bc29-5cbf-4816-a0ae-b24ddf88776c _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Dec  1 04:38:01 np0005540741 nova_compute[250706]: 2025-12-01 09:38:01.517 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:38:01 np0005540741 nova_compute[250706]: 2025-12-01 09:38:01.517 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:38:01 np0005540741 nova_compute[250706]: 2025-12-01 09:38:01.568 250710 DEBUG oslo_concurrency.processutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:38:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v822: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 78 KiB/s rd, 3.4 KiB/s wr, 104 op/s
Dec  1 04:38:01 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec  1 04:38:01 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1145948787' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 04:38:01 np0005540741 nova_compute[250706]: 2025-12-01 09:38:01.996 250710 DEBUG oslo_concurrency.processutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:38:02 np0005540741 nova_compute[250706]: 2025-12-01 09:38:02.002 250710 DEBUG nova.compute.provider_tree [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 04:38:02 np0005540741 nova_compute[250706]: 2025-12-01 09:38:02.018 250710 DEBUG nova.scheduler.client.report [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 04:38:02 np0005540741 nova_compute[250706]: 2025-12-01 09:38:02.037 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.520s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:38:02 np0005540741 nova_compute[250706]: 2025-12-01 09:38:02.069 250710 INFO nova.scheduler.client.report [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Deleted allocations for instance 6740b382-574d-4ced-a156-11a531b94114#033[00m
Dec  1 04:38:02 np0005540741 nova_compute[250706]: 2025-12-01 09:38:02.157 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "6740b382-574d-4ced-a156-11a531b94114" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.780s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:38:02 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e87 do_prune osdmap full prune enabled
Dec  1 04:38:02 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e88 e88: 3 total, 3 up, 3 in
Dec  1 04:38:02 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e88: 3 total, 3 up, 3 in
Dec  1 04:38:02 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec  1 04:38:02 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3430721002' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 04:38:02 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec  1 04:38:02 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3430721002' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 04:38:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v824: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 66 KiB/s rd, 3.4 KiB/s wr, 88 op/s
Dec  1 04:38:05 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:38:05.214 159899 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:9e:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '66:a0:73:58:3b:fd'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 04:38:05 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:38:05.215 159899 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 04:38:05 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:38:05 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e88 do_prune osdmap full prune enabled
Dec  1 04:38:05 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e89 e89: 3 total, 3 up, 3 in
Dec  1 04:38:05 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e89: 3 total, 3 up, 3 in
Dec  1 04:38:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v826: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 77 KiB/s rd, 3.0 KiB/s wr, 100 op/s
Dec  1 04:38:06 np0005540741 podman[257682]: 2025-12-01 09:38:06.960122605 +0000 UTC m=+0.061630684 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Dec  1 04:38:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v827: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 47 KiB/s rd, 1.9 KiB/s wr, 60 op/s
Dec  1 04:38:08 np0005540741 nova_compute[250706]: 2025-12-01 09:38:08.881 250710 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764581873.87877, beb3fd59-b728-4e62-bc14-b171eebe8ee3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 04:38:08 np0005540741 nova_compute[250706]: 2025-12-01 09:38:08.881 250710 INFO nova.compute.manager [-] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] VM Stopped (Lifecycle Event)#033[00m
Dec  1 04:38:08 np0005540741 nova_compute[250706]: 2025-12-01 09:38:08.904 250710 DEBUG nova.compute.manager [None req-91768fd3-3176-4f1a-8355-cb6b0dbe4f56 - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 04:38:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v828: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 1.2 KiB/s wr, 46 op/s
Dec  1 04:38:10 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:38:10 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e89 do_prune osdmap full prune enabled
Dec  1 04:38:10 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 e90: 3 total, 3 up, 3 in
Dec  1 04:38:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v829: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 31 KiB/s rd, 1.1 KiB/s wr, 39 op/s
Dec  1 04:38:12 np0005540741 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e90: 3 total, 3 up, 3 in
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:38:13
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'vms', 'cephfs.cephfs.data', 'cephfs.cephfs.meta', 'images', 'volumes', '.mgr']
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:38:13 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:38:13.217 159899 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a8013a17-6378-4c2f-a5de-9d3b29c7a42e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:38:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v831: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 13 KiB/s rd, 0 B/s wr, 16 op/s
Dec  1 04:38:14 np0005540741 nova_compute[250706]: 2025-12-01 09:38:14.487 250710 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764581879.4857285, 6740b382-574d-4ced-a156-11a531b94114 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 04:38:14 np0005540741 nova_compute[250706]: 2025-12-01 09:38:14.487 250710 INFO nova.compute.manager [-] [instance: 6740b382-574d-4ced-a156-11a531b94114] VM Stopped (Lifecycle Event)#033[00m
Dec  1 04:38:14 np0005540741 nova_compute[250706]: 2025-12-01 09:38:14.511 250710 DEBUG nova.compute.manager [None req-ab855892-ca39-4092-916a-99c470d0237e - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 04:38:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:38:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v832: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v833: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:38:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:38:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:38:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:38:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec  1 04:38:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:38:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec  1 04:38:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:38:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:38:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:38:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006661126644201341 of space, bias 1.0, pg target 0.19983379932604023 quantized to 32 (current 32)
Dec  1 04:38:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:38:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:38:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:38:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:38:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v834: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:38:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:38:20.476 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:38:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:38:20.477 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:38:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:38:20.477 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:38:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v835: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:38:21 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:38:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:38:21 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:38:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:38:21 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:38:21 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 45a19509-b60b-41ae-82a2-07fa38b73aa0 does not exist
Dec  1 04:38:21 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 2153b517-35de-4c09-b96b-713702e40265 does not exist
Dec  1 04:38:21 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev b2bec11f-8d77-4b04-99b2-bf9acac9e413 does not exist
Dec  1 04:38:21 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:38:21 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:38:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:38:22 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:38:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:38:22 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:38:22 np0005540741 podman[257974]: 2025-12-01 09:38:22.522513418 +0000 UTC m=+0.047592541 container create 579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_tu, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True)
Dec  1 04:38:22 np0005540741 systemd[1]: Started libpod-conmon-579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b.scope.
Dec  1 04:38:22 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:38:22 np0005540741 podman[257974]: 2025-12-01 09:38:22.501620777 +0000 UTC m=+0.026699960 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:38:22 np0005540741 podman[257974]: 2025-12-01 09:38:22.613564158 +0000 UTC m=+0.138643361 container init 579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_tu, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:38:22 np0005540741 podman[257974]: 2025-12-01 09:38:22.625493971 +0000 UTC m=+0.150573094 container start 579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_tu, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Dec  1 04:38:22 np0005540741 podman[257974]: 2025-12-01 09:38:22.628851258 +0000 UTC m=+0.153930421 container attach 579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_tu, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:38:22 np0005540741 festive_tu[257990]: 167 167
Dec  1 04:38:22 np0005540741 systemd[1]: libpod-579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b.scope: Deactivated successfully.
Dec  1 04:38:22 np0005540741 conmon[257990]: conmon 579ade348bfcece26a0b <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b.scope/container/memory.events
Dec  1 04:38:22 np0005540741 podman[257974]: 2025-12-01 09:38:22.633950395 +0000 UTC m=+0.159029548 container died 579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_tu, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:38:22 np0005540741 systemd[1]: var-lib-containers-storage-overlay-875d6f89812e96f143511a8046d625095791841c219678d56f90473a1a69ff5a-merged.mount: Deactivated successfully.
Dec  1 04:38:22 np0005540741 podman[257974]: 2025-12-01 09:38:22.681948366 +0000 UTC m=+0.207027519 container remove 579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_tu, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:38:22 np0005540741 systemd[1]: libpod-conmon-579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b.scope: Deactivated successfully.
Dec  1 04:38:22 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:38:22 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:38:22 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:38:22 np0005540741 podman[258013]: 2025-12-01 09:38:22.873558959 +0000 UTC m=+0.061516051 container create e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_edison, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:38:22 np0005540741 systemd[1]: Started libpod-conmon-e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc.scope.
Dec  1 04:38:22 np0005540741 podman[258013]: 2025-12-01 09:38:22.842243968 +0000 UTC m=+0.030201080 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:38:22 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:38:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c9957dbdf25e10d7611572038980a73c89e31aeadcf5277d253ab94582cb2be/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:38:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c9957dbdf25e10d7611572038980a73c89e31aeadcf5277d253ab94582cb2be/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:38:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c9957dbdf25e10d7611572038980a73c89e31aeadcf5277d253ab94582cb2be/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:38:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c9957dbdf25e10d7611572038980a73c89e31aeadcf5277d253ab94582cb2be/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:38:22 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c9957dbdf25e10d7611572038980a73c89e31aeadcf5277d253ab94582cb2be/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:38:22 np0005540741 podman[258013]: 2025-12-01 09:38:22.978947542 +0000 UTC m=+0.166904634 container init e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_edison, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Dec  1 04:38:22 np0005540741 podman[258013]: 2025-12-01 09:38:22.987531969 +0000 UTC m=+0.175489061 container start e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_edison, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:38:22 np0005540741 podman[258013]: 2025-12-01 09:38:22.991474572 +0000 UTC m=+0.179431674 container attach e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_edison, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec  1 04:38:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v836: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:23 np0005540741 podman[258043]: 2025-12-01 09:38:23.974107207 +0000 UTC m=+0.068376278 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 04:38:24 np0005540741 hopeful_edison[258030]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:38:24 np0005540741 hopeful_edison[258030]: --> relative data size: 1.0
Dec  1 04:38:24 np0005540741 hopeful_edison[258030]: --> All data devices are unavailable
Dec  1 04:38:24 np0005540741 systemd[1]: libpod-e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc.scope: Deactivated successfully.
Dec  1 04:38:24 np0005540741 systemd[1]: libpod-e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc.scope: Consumed 1.136s CPU time.
Dec  1 04:38:24 np0005540741 podman[258013]: 2025-12-01 09:38:24.166527324 +0000 UTC m=+1.354484396 container died e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_edison, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Dec  1 04:38:24 np0005540741 systemd[1]: var-lib-containers-storage-overlay-6c9957dbdf25e10d7611572038980a73c89e31aeadcf5277d253ab94582cb2be-merged.mount: Deactivated successfully.
Dec  1 04:38:24 np0005540741 podman[258013]: 2025-12-01 09:38:24.254115575 +0000 UTC m=+1.442072637 container remove e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_edison, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:38:24 np0005540741 systemd[1]: libpod-conmon-e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc.scope: Deactivated successfully.
Dec  1 04:38:24 np0005540741 podman[258236]: 2025-12-01 09:38:24.837202933 +0000 UTC m=+0.036281345 container create f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_lamarr, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:38:24 np0005540741 systemd[1]: Started libpod-conmon-f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8.scope.
Dec  1 04:38:24 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:38:24 np0005540741 podman[258236]: 2025-12-01 09:38:24.82318246 +0000 UTC m=+0.022260892 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:38:24 np0005540741 podman[258236]: 2025-12-01 09:38:24.921106418 +0000 UTC m=+0.120184870 container init f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_lamarr, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default)
Dec  1 04:38:24 np0005540741 podman[258236]: 2025-12-01 09:38:24.927152732 +0000 UTC m=+0.126231144 container start f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_lamarr, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec  1 04:38:24 np0005540741 podman[258236]: 2025-12-01 09:38:24.930067036 +0000 UTC m=+0.129145488 container attach f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_lamarr, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef)
Dec  1 04:38:24 np0005540741 busy_lamarr[258252]: 167 167
Dec  1 04:38:24 np0005540741 systemd[1]: libpod-f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8.scope: Deactivated successfully.
Dec  1 04:38:24 np0005540741 podman[258257]: 2025-12-01 09:38:24.981872846 +0000 UTC m=+0.031547068 container died f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_lamarr, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef)
Dec  1 04:38:25 np0005540741 systemd[1]: var-lib-containers-storage-overlay-4f2249abd3b67e0344220ab72cd22c89450dfe486f9b67d6d1c0cdb050ba3fac-merged.mount: Deactivated successfully.
Dec  1 04:38:25 np0005540741 podman[258257]: 2025-12-01 09:38:25.01882559 +0000 UTC m=+0.068499742 container remove f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_lamarr, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:38:25 np0005540741 systemd[1]: libpod-conmon-f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8.scope: Deactivated successfully.
Dec  1 04:38:25 np0005540741 podman[258279]: 2025-12-01 09:38:25.277488563 +0000 UTC m=+0.062001645 container create 05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_elion, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:38:25 np0005540741 systemd[1]: Started libpod-conmon-05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd.scope.
Dec  1 04:38:25 np0005540741 podman[258279]: 2025-12-01 09:38:25.253014499 +0000 UTC m=+0.037527591 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:38:25 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:38:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9fdedb1ac4d0d28a148667f2301870f5158812e506266e4df35834dbf857a3d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:38:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9fdedb1ac4d0d28a148667f2301870f5158812e506266e4df35834dbf857a3d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:38:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9fdedb1ac4d0d28a148667f2301870f5158812e506266e4df35834dbf857a3d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:38:25 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9fdedb1ac4d0d28a148667f2301870f5158812e506266e4df35834dbf857a3d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:38:25 np0005540741 podman[258279]: 2025-12-01 09:38:25.374931297 +0000 UTC m=+0.159444359 container init 05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_elion, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:38:25 np0005540741 podman[258279]: 2025-12-01 09:38:25.38927334 +0000 UTC m=+0.173786422 container start 05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_elion, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec  1 04:38:25 np0005540741 podman[258279]: 2025-12-01 09:38:25.394662935 +0000 UTC m=+0.179175997 container attach 05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_elion, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Dec  1 04:38:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:38:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v837: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]: {
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:    "0": [
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:        {
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "devices": [
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "/dev/loop3"
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            ],
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "lv_name": "ceph_lv0",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "lv_size": "21470642176",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "name": "ceph_lv0",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "tags": {
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.cluster_name": "ceph",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.crush_device_class": "",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.encrypted": "0",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.osd_id": "0",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.type": "block",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.vdo": "0"
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            },
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "type": "block",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "vg_name": "ceph_vg0"
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:        }
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:    ],
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:    "1": [
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:        {
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "devices": [
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "/dev/loop4"
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            ],
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "lv_name": "ceph_lv1",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "lv_size": "21470642176",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "name": "ceph_lv1",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "tags": {
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.cluster_name": "ceph",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.crush_device_class": "",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.encrypted": "0",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.osd_id": "1",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.type": "block",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.vdo": "0"
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            },
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "type": "block",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "vg_name": "ceph_vg1"
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:        }
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:    ],
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:    "2": [
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:        {
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "devices": [
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "/dev/loop5"
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            ],
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "lv_name": "ceph_lv2",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "lv_size": "21470642176",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "name": "ceph_lv2",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "tags": {
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.cluster_name": "ceph",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.crush_device_class": "",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.encrypted": "0",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.osd_id": "2",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.type": "block",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:                "ceph.vdo": "0"
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            },
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "type": "block",
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:            "vg_name": "ceph_vg2"
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:        }
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]:    ]
Dec  1 04:38:26 np0005540741 thirsty_elion[258296]: }
Dec  1 04:38:26 np0005540741 podman[258279]: 2025-12-01 09:38:26.207269788 +0000 UTC m=+0.991782840 container died 05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_elion, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec  1 04:38:26 np0005540741 systemd[1]: libpod-05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd.scope: Deactivated successfully.
Dec  1 04:38:26 np0005540741 systemd[1]: var-lib-containers-storage-overlay-b9fdedb1ac4d0d28a148667f2301870f5158812e506266e4df35834dbf857a3d-merged.mount: Deactivated successfully.
Dec  1 04:38:26 np0005540741 podman[258279]: 2025-12-01 09:38:26.389160922 +0000 UTC m=+1.173673964 container remove 05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_elion, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:38:26 np0005540741 systemd[1]: libpod-conmon-05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd.scope: Deactivated successfully.
Dec  1 04:38:27 np0005540741 podman[258461]: 2025-12-01 09:38:27.021503777 +0000 UTC m=+0.042807993 container create da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_mclean, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef)
Dec  1 04:38:27 np0005540741 systemd[1]: Started libpod-conmon-da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e.scope.
Dec  1 04:38:27 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:38:27 np0005540741 podman[258461]: 2025-12-01 09:38:27.000924114 +0000 UTC m=+0.022228330 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:38:27 np0005540741 podman[258461]: 2025-12-01 09:38:27.108755617 +0000 UTC m=+0.130059803 container init da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_mclean, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec  1 04:38:27 np0005540741 podman[258461]: 2025-12-01 09:38:27.114010819 +0000 UTC m=+0.135315005 container start da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_mclean, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec  1 04:38:27 np0005540741 podman[258461]: 2025-12-01 09:38:27.117093027 +0000 UTC m=+0.138397213 container attach da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_mclean, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:38:27 np0005540741 confident_mclean[258478]: 167 167
Dec  1 04:38:27 np0005540741 systemd[1]: libpod-da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e.scope: Deactivated successfully.
Dec  1 04:38:27 np0005540741 podman[258461]: 2025-12-01 09:38:27.123309046 +0000 UTC m=+0.144613242 container died da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_mclean, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec  1 04:38:27 np0005540741 systemd[1]: var-lib-containers-storage-overlay-758cce2a7a3502b74cc8441f35cf0713ee36b985abf2f369098f07c3aee90dbf-merged.mount: Deactivated successfully.
Dec  1 04:38:27 np0005540741 podman[258461]: 2025-12-01 09:38:27.162125523 +0000 UTC m=+0.183429729 container remove da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_mclean, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2)
Dec  1 04:38:27 np0005540741 systemd[1]: libpod-conmon-da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e.scope: Deactivated successfully.
Dec  1 04:38:27 np0005540741 podman[258500]: 2025-12-01 09:38:27.312663425 +0000 UTC m=+0.043912085 container create 0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_khorana, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  1 04:38:27 np0005540741 systemd[1]: Started libpod-conmon-0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71.scope.
Dec  1 04:38:27 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:38:27 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2980e476e6a020ddcd40acf0843a19254d23a08a466482728ececbcd9ffa3e49/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:38:27 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2980e476e6a020ddcd40acf0843a19254d23a08a466482728ececbcd9ffa3e49/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:38:27 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2980e476e6a020ddcd40acf0843a19254d23a08a466482728ececbcd9ffa3e49/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:38:27 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2980e476e6a020ddcd40acf0843a19254d23a08a466482728ececbcd9ffa3e49/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:38:27 np0005540741 podman[258500]: 2025-12-01 09:38:27.374313739 +0000 UTC m=+0.105562419 container init 0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_khorana, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:38:27 np0005540741 podman[258500]: 2025-12-01 09:38:27.384752469 +0000 UTC m=+0.116001119 container start 0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_khorana, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:38:27 np0005540741 podman[258500]: 2025-12-01 09:38:27.291758143 +0000 UTC m=+0.023006833 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:38:27 np0005540741 podman[258500]: 2025-12-01 09:38:27.387862319 +0000 UTC m=+0.119110999 container attach 0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_khorana, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:38:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v838: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]: {
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:        "osd_id": 0,
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:        "type": "bluestore"
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:    },
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:        "osd_id": 1,
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:        "type": "bluestore"
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:    },
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:        "osd_id": 2,
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:        "type": "bluestore"
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]:    }
Dec  1 04:38:28 np0005540741 fervent_khorana[258517]: }
Dec  1 04:38:28 np0005540741 systemd[1]: libpod-0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71.scope: Deactivated successfully.
Dec  1 04:38:28 np0005540741 systemd[1]: libpod-0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71.scope: Consumed 1.007s CPU time.
Dec  1 04:38:28 np0005540741 podman[258550]: 2025-12-01 09:38:28.427067672 +0000 UTC m=+0.023234579 container died 0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_khorana, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec  1 04:38:28 np0005540741 systemd[1]: var-lib-containers-storage-overlay-2980e476e6a020ddcd40acf0843a19254d23a08a466482728ececbcd9ffa3e49-merged.mount: Deactivated successfully.
Dec  1 04:38:28 np0005540741 podman[258550]: 2025-12-01 09:38:28.47668456 +0000 UTC m=+0.072851447 container remove 0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_khorana, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Dec  1 04:38:28 np0005540741 systemd[1]: libpod-conmon-0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71.scope: Deactivated successfully.
Dec  1 04:38:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:38:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:38:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:38:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:38:28 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 3af4ca4f-049b-44c7-be1c-87cc86e7c9e4 does not exist
Dec  1 04:38:28 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:38:28 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:38:29 np0005540741 nova_compute[250706]: 2025-12-01 09:38:29.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:38:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v839: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:38:31 np0005540741 nova_compute[250706]: 2025-12-01 09:38:31.048 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:38:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v840: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:31 np0005540741 podman[258616]: 2025-12-01 09:38:31.989970325 +0000 UTC m=+0.094257713 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 04:38:32 np0005540741 nova_compute[250706]: 2025-12-01 09:38:32.054 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:38:32 np0005540741 nova_compute[250706]: 2025-12-01 09:38:32.055 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 04:38:32 np0005540741 nova_compute[250706]: 2025-12-01 09:38:32.055 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 04:38:32 np0005540741 nova_compute[250706]: 2025-12-01 09:38:32.103 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 04:38:32 np0005540741 nova_compute[250706]: 2025-12-01 09:38:32.104 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:38:33 np0005540741 nova_compute[250706]: 2025-12-01 09:38:33.051 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:38:33 np0005540741 nova_compute[250706]: 2025-12-01 09:38:33.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:38:33 np0005540741 nova_compute[250706]: 2025-12-01 09:38:33.052 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 04:38:33 np0005540741 nova_compute[250706]: 2025-12-01 09:38:33.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:38:33 np0005540741 nova_compute[250706]: 2025-12-01 09:38:33.083 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:38:33 np0005540741 nova_compute[250706]: 2025-12-01 09:38:33.084 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:38:33 np0005540741 nova_compute[250706]: 2025-12-01 09:38:33.084 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:38:33 np0005540741 nova_compute[250706]: 2025-12-01 09:38:33.084 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 04:38:33 np0005540741 nova_compute[250706]: 2025-12-01 09:38:33.085 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:38:33 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec  1 04:38:33 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3040711938' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 04:38:33 np0005540741 nova_compute[250706]: 2025-12-01 09:38:33.572 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:38:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v841: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:33 np0005540741 nova_compute[250706]: 2025-12-01 09:38:33.731 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 04:38:33 np0005540741 nova_compute[250706]: 2025-12-01 09:38:33.732 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5216MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 04:38:33 np0005540741 nova_compute[250706]: 2025-12-01 09:38:33.733 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:38:33 np0005540741 nova_compute[250706]: 2025-12-01 09:38:33.733 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:38:33 np0005540741 nova_compute[250706]: 2025-12-01 09:38:33.815 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 04:38:33 np0005540741 nova_compute[250706]: 2025-12-01 09:38:33.815 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 04:38:33 np0005540741 nova_compute[250706]: 2025-12-01 09:38:33.832 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:38:34 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec  1 04:38:34 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3455265482' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 04:38:34 np0005540741 nova_compute[250706]: 2025-12-01 09:38:34.247 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:38:34 np0005540741 nova_compute[250706]: 2025-12-01 09:38:34.253 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 04:38:34 np0005540741 nova_compute[250706]: 2025-12-01 09:38:34.280 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 04:38:34 np0005540741 nova_compute[250706]: 2025-12-01 09:38:34.309 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 04:38:34 np0005540741 nova_compute[250706]: 2025-12-01 09:38:34.310 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:38:35 np0005540741 nova_compute[250706]: 2025-12-01 09:38:35.306 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:38:35 np0005540741 nova_compute[250706]: 2025-12-01 09:38:35.323 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:38:35 np0005540741 nova_compute[250706]: 2025-12-01 09:38:35.323 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:38:35 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:38:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v842: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v843: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:37 np0005540741 podman[258688]: 2025-12-01 09:38:37.95741117 +0000 UTC m=+0.055197560 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  1 04:38:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v844: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:38:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v845: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:38:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:38:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:38:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:38:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:38:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:38:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v846: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec  1 04:38:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2472710661' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 04:38:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec  1 04:38:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2472710661' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #39. Immutable memtables: 0.
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.544361) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 39
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581925544416, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1942, "num_deletes": 268, "total_data_size": 1984584, "memory_usage": 2022936, "flush_reason": "Manual Compaction"}
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #40: started
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581925554735, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 40, "file_size": 1392693, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15622, "largest_seqno": 17563, "table_properties": {"data_size": 1385258, "index_size": 4253, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 18034, "raw_average_key_size": 21, "raw_value_size": 1369345, "raw_average_value_size": 1626, "num_data_blocks": 191, "num_entries": 842, "num_filter_entries": 842, "num_deletions": 268, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764581783, "oldest_key_time": 1764581783, "file_creation_time": 1764581925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 40, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 10421 microseconds, and 5091 cpu microseconds.
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.554786) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #40: 1392693 bytes OK
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.554807) [db/memtable_list.cc:519] [default] Level-0 commit table #40 started
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.556438) [db/memtable_list.cc:722] [default] Level-0 commit table #40: memtable #1 done
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.556455) EVENT_LOG_v1 {"time_micros": 1764581925556449, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.556479) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 1976074, prev total WAL file size 1976074, number of live WAL files 2.
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000036.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.557357) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353031' seq:72057594037927935, type:22 .. '6D67727374617400373532' seq:0, type:0; will stop at (end)
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [40(1360KB)], [38(5489KB)]
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581925557454, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [40], "files_L6": [38], "score": -1, "input_data_size": 7013851, "oldest_snapshot_seqno": -1}
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #41: 3871 keys, 5465875 bytes, temperature: kUnknown
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581925599548, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 41, "file_size": 5465875, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5438996, "index_size": 16082, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9733, "raw_key_size": 91208, "raw_average_key_size": 23, "raw_value_size": 5368411, "raw_average_value_size": 1386, "num_data_blocks": 695, "num_entries": 3871, "num_filter_entries": 3871, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764581925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.599812) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 5465875 bytes
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.600941) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 166.3 rd, 129.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 5.4 +0.0 blob) out(5.2 +0.0 blob), read-write-amplify(9.0) write-amplify(3.9) OK, records in: 4341, records dropped: 470 output_compression: NoCompression
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.600958) EVENT_LOG_v1 {"time_micros": 1764581925600949, "job": 18, "event": "compaction_finished", "compaction_time_micros": 42173, "compaction_time_cpu_micros": 18639, "output_level": 6, "num_output_files": 1, "total_output_size": 5465875, "num_input_records": 4341, "num_output_records": 3871, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000040.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581925601313, "job": 18, "event": "table_file_deletion", "file_number": 40}
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581925602485, "job": 18, "event": "table_file_deletion", "file_number": 38}
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.557164) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.602638) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.602646) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.602647) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.602649) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:38:45 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.602651) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:38:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v847: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v848: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v849: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:50 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:38:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v850: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v851: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:54 np0005540741 podman[258708]: 2025-12-01 09:38:54.98557644 +0000 UTC m=+0.075009269 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.license=GPLv2)
Dec  1 04:38:55 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:38:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v852: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v853: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:38:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v854: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:39:00 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:39:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v855: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:39:03 np0005540741 podman[258729]: 2025-12-01 09:39:03.001627874 +0000 UTC m=+0.102146140 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  1 04:39:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v856: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:39:05 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:39:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v857: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:39:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v858: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:39:08 np0005540741 podman[258755]: 2025-12-01 09:39:08.969159342 +0000 UTC m=+0.063781057 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  1 04:39:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v859: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:39:10 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:39:10 np0005540741 systemd-logind[788]: New session 52 of user zuul.
Dec  1 04:39:10 np0005540741 systemd[1]: Started Session 52 of User zuul.
Dec  1 04:39:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v860: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:39:13
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'cephfs.cephfs.meta', 'cephfs.cephfs.data', 'images', 'volumes', 'vms', '.mgr']
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14702 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:39:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v861: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:39:14 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14704 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:39:14 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status"} v 0) v1
Dec  1 04:39:14 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/536856944' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec  1 04:39:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:39:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v862: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:39:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v863: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:39:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:39:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:39:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:39:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:39:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec  1 04:39:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:39:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec  1 04:39:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:39:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:39:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:39:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006661126644201341 of space, bias 1.0, pg target 0.19983379932604023 quantized to 32 (current 32)
Dec  1 04:39:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:39:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:39:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:39:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:39:19 np0005540741 ovs-vsctl[259082]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec  1 04:39:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v864: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:39:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:39:20.477 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:39:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:39:20.479 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:39:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:39:20.479 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:39:20 np0005540741 virtqemud[250400]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec  1 04:39:20 np0005540741 virtqemud[250400]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec  1 04:39:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:39:20 np0005540741 virtqemud[250400]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  1 04:39:21 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: cache status {prefix=cache status} (starting...)
Dec  1 04:39:21 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: client ls {prefix=client ls} (starting...)
Dec  1 04:39:21 np0005540741 lvm[259415]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec  1 04:39:21 np0005540741 lvm[259415]: VG ceph_vg2 finished
Dec  1 04:39:21 np0005540741 lvm[259421]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec  1 04:39:21 np0005540741 lvm[259421]: VG ceph_vg1 finished
Dec  1 04:39:21 np0005540741 lvm[259427]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 04:39:21 np0005540741 lvm[259427]: VG ceph_vg0 finished
Dec  1 04:39:21 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14708 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:39:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v865: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:39:21 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: damage ls {prefix=damage ls} (starting...)
Dec  1 04:39:22 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14710 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:39:22 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump loads {prefix=dump loads} (starting...)
Dec  1 04:39:22 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec  1 04:39:22 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec  1 04:39:22 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec  1 04:39:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "report"} v 0) v1
Dec  1 04:39:22 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1950735095' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec  1 04:39:22 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec  1 04:39:22 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14716 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:39:22 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:39:22.859+0000 7fd2d6503640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec  1 04:39:22 np0005540741 ceph-mgr[75324]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec  1 04:39:22 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec  1 04:39:22 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:39:22 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1596225968' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:39:23 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec  1 04:39:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Dec  1 04:39:23 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2498549434' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec  1 04:39:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config log"} v 0) v1
Dec  1 04:39:23 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3136623468' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec  1 04:39:23 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: ops {prefix=ops} (starting...)
Dec  1 04:39:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v866: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:39:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Dec  1 04:39:23 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/670763351' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  1 04:39:23 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Dec  1 04:39:23 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3654002977' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec  1 04:39:24 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: session ls {prefix=session ls} (starting...)
Dec  1 04:39:24 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: status {prefix=status} (starting...)
Dec  1 04:39:24 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14730 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:39:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Dec  1 04:39:24 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4087764241' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  1 04:39:24 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14734 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:39:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Dec  1 04:39:24 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1911139208' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  1 04:39:24 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Dec  1 04:39:24 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3965947463' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  1 04:39:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "features"} v 0) v1
Dec  1 04:39:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/404477119' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec  1 04:39:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Dec  1 04:39:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/122866841' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec  1 04:39:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Dec  1 04:39:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2845613920' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec  1 04:39:25 np0005540741 podman[259965]: 2025-12-01 09:39:25.53723352 +0000 UTC m=+0.104881639 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  1 04:39:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:39:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v867: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:39:25 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14744 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:39:25 np0005540741 ceph-mgr[75324]: mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Dec  1 04:39:25 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:39:25.861+0000 7fd2d6503640 -1 mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Dec  1 04:39:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Dec  1 04:39:25 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/428750214' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  1 04:39:26 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Dec  1 04:39:26 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2199971143' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec  1 04:39:26 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14750 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:39:26 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14754 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:39:26 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Dec  1 04:39:26 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3996741100' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec  1 04:39:27 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14756 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:39:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Dec  1 04:39:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1132193302' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  1 04:39:27 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14760 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:39:27 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Dec  1 04:39:27 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2888214778' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  1 04:39:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v868: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 55238656 unmapped: 3424256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 45 handle_osd_map epochs [46,46], i have 45, src has [1,46]
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 45 handle_osd_map epochs [46,46], i have 46, src has [1,46]
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.891715050s of 14.028326988s, submitted: 204
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000128 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000016 1 0.000034
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000007 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000233 1 0.000059
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000065 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000013
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000009 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000106 1 0.000040
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000093 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000017 1 0.000032
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000182 1 0.000050
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000044 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000009 1 0.000022
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000112 1 0.000042
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000083 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000010 1 0.000019
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000130 1 0.000037
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000106 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000018
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000137 1 0.000033
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000036 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000012 1 0.000025
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000113 1 0.000041
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000048 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000019
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000175 1 0.000044
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000030 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000026 1 0.000033
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000155 1 0.000030
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000022 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000011 1 0.000018
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000068 1 0.000037
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000036 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000075 1 0.000039
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000023 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000014
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000064 1 0.000041
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000046 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000018
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000078 1 0.000036
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000021 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000016
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000075 1 0.000032
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000018 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000012
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000052 1 0.000028
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000025 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000013
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000077 1 0.000047
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.927200 13 0.000080
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.936561 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.936641 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.936796 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073251724s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184906006s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073219299s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184906006s@ mbc={}] exit Reset 0.000064 1 0.000080
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073219299s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184906006s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073219299s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184906006s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073219299s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184906006s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073219299s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184906006s@ mbc={}] exit Start 0.000010 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073219299s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184906006s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.901289 7 0.000281
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.913195 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.913496 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.913534 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098457336s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210304260s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098435402s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210304260s@ mbc={}] exit Reset 0.000039 1 0.000070
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098435402s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210304260s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098435402s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210304260s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098435402s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210304260s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098435402s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210304260s@ mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098435402s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210304260s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.896487 7 0.000073
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.909281 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.909484 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.909517 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103429794s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215400696s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103402138s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215400696s@ mbc={}] exit Reset 0.000041 1 0.000066
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103402138s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215400696s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103402138s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215400696s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103402138s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215400696s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103402138s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215400696s@ mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103402138s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215400696s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.927773 13 0.000115
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.937353 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.937427 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.937494 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072329521s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184417725s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072308540s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184417725s@ mbc={}] exit Reset 0.000032 1 0.000063
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072308540s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184417725s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072308540s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184417725s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072308540s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184417725s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072308540s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184417725s@ mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072308540s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184417725s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.927897 13 0.000074
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.937447 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.937553 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.937689 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072682381s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184875488s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072663307s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184875488s@ mbc={}] exit Reset 0.000028 1 0.000051
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072663307s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184875488s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072663307s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184875488s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072663307s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184875488s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072663307s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184875488s@ mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072663307s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184875488s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.928054 13 0.000084
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.937620 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.937725 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.937852 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072598457s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184898376s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072576523s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184898376s@ mbc={}] exit Reset 0.000032 1 0.000051
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072576523s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184898376s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072576523s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184898376s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072576523s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184898376s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072576523s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184898376s@ mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072576523s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184898376s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.928159 13 0.000106
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.937739 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.937989 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.938036 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071966171s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184402466s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071948051s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184402466s@ mbc={}] exit Reset 0.000028 1 0.000048
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071948051s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184402466s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071948051s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184402466s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071948051s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184402466s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071948051s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184402466s@ mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071948051s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184402466s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902126 7 0.000057
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.913748 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.913867 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.913890 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097693443s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210258484s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097675323s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210258484s@ mbc={}] exit Reset 0.000028 1 0.000054
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097675323s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210258484s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097675323s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210258484s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097675323s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210258484s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097675323s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210258484s@ mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097675323s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210258484s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.928506 13 0.000065
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.938251 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.938347 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.938376 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071633339s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184318542s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071617126s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] exit Reset 0.000040 1 0.000062
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071617126s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071617126s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071617126s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071617126s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071617126s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902190 7 0.000240
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.913823 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.913879 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.913893 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097608566s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210380554s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097588539s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210380554s@ mbc={}] exit Reset 0.000045 1 0.000048
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097588539s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210380554s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097588539s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210380554s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097588539s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210380554s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097588539s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210380554s@ mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097588539s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210380554s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902343 7 0.000331
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.913857 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.913959 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.913983 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097500801s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210418701s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097480774s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210418701s@ mbc={}] exit Reset 0.000030 1 0.000050
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097480774s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210418701s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097480774s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210418701s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097480774s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210418701s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097480774s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210418701s@ mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097480774s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210418701s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.006040 2 0.000070
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000015 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005844 2 0.000062
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005320 2 0.000075
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005044 2 0.000043
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004706 2 0.000035
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004729 2 0.000030
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004437 2 0.000041
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005088 2 0.000033
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000008 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004149 2 0.000055
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004079 2 0.000025
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003848 2 0.000026
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003674 2 0.000034
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003496 2 0.000025
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004454 2 0.000034
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003915 2 0.000050
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.931495 13 0.000100
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.941326 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.905011 7 0.000061
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916481 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.916656 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.916725 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.931678 13 0.000097
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.941587 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.905011 7 0.000053
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.905101 7 0.000075
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.941679 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916535 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.916624 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916663 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.916697 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.941501 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.916744 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.941630 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.916761 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068525314s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184318542s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094830513s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210655212s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068471909s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] exit Reset 0.000091 1 0.000226
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068471909s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094796181s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210655212s@ mbc={}] exit Reset 0.000098 1 0.000147
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068471909s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094977379s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210678101s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068471909s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094796181s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210655212s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094796181s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210655212s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068471909s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] exit Start 0.000010 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068471909s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094768524s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210685730s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094736099s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210678101s@ mbc={}] exit Reset 0.000259 1 0.000292
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094736099s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210678101s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094736099s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210678101s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094737053s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210685730s@ mbc={}] exit Reset 0.000193 1 0.000236
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094736099s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210678101s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094736099s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210678101s@ mbc={}] exit Start 0.000007 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094737053s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210685730s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094736099s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210678101s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094737053s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210685730s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094737053s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210685730s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094737053s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210685730s@ mbc={}] exit Start 0.000009 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094737053s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210685730s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.905373 7 0.000314
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.931759 13 0.000080
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.942760 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916500 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.942846 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.916655 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.942873 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.916761 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.931929 13 0.000128
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.942272 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.942359 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.942388 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094796181s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210655212s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.941706 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094796181s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210655212s@ mbc={}] exit Start 0.000249 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067185402s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.183311462s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094796181s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210655212s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067164421s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183311462s@ mbc={}] exit Reset 0.000044 1 0.000066
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067219734s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.183364868s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067164421s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183311462s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067164421s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183311462s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067164421s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183311462s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067164421s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183311462s@ mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067164421s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183311462s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067193031s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183364868s@ mbc={}] exit Reset 0.000046 1 0.000480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067193031s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183364868s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067193031s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183364868s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067193031s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183364868s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067193031s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183364868s@ mbc={}] exit Start 0.000008 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094496727s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210739136s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.905435 7 0.000041
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094424248s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210739136s@ mbc={}] exit Reset 0.000226 1 0.000256
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094424248s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210739136s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094424248s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210739136s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094424248s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210739136s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094424248s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210739136s@ mbc={}] exit Start 0.000008 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094424248s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210739136s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.932575 13 0.000084
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.943519 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.943593 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.943617 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065610886s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181999207s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065589905s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181999207s@ mbc={}] exit Reset 0.000041 1 0.000062
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065589905s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181999207s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065589905s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181999207s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065589905s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181999207s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065589905s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181999207s@ mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065589905s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181999207s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.932457 13 0.000106
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.942469 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.942727 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.942780 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916490 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.916803 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.916832 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094371796s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210922241s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.905589 7 0.000045
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916482 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094347954s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210922241s@ mbc={}] exit Reset 0.000042 1 0.000293
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.916589 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094347954s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210922241s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094347954s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210922241s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094347954s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210922241s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094347954s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210922241s@ mbc={}] exit Start 0.000008 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066887856s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.183380127s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094347954s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210922241s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066744804s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183380127s@ mbc={}] exit Reset 0.000163 1 0.000184
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066744804s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183380127s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066744804s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183380127s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066744804s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183380127s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066744804s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183380127s@ mbc={}] exit Start 0.000007 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066744804s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183380127s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.932766 13 0.000112
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.943042 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.944461 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.944511 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066463470s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.183166504s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068760872s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184867859s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.905785 7 0.000060
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916663 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.916781 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.916818 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068104744s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184867859s@ mbc={}] exit Reset 0.000684 1 0.000706
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068104744s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184867859s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068104744s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184867859s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068104744s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184867859s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094186783s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210975647s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068104744s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184867859s@ mbc={}] exit Start 0.000008 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068104744s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184867859s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094164848s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210975647s@ mbc={}] exit Reset 0.000040 1 0.000062
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094164848s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210975647s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094164848s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210975647s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094164848s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210975647s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094164848s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210975647s@ mbc={}] exit Start 0.000007 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094164848s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210975647s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.933300 13 0.000347
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.944706 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.944774 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.944797 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064965248s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181869507s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064947128s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181869507s@ mbc={}] exit Reset 0.000049 1 0.000052
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064947128s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181869507s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064947128s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181869507s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064947128s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181869507s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064947128s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181869507s@ mbc={}] exit Start 0.000007 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064947128s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181869507s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066201210s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183166504s@ mbc={}] exit Reset 0.000279 1 0.000299
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066201210s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183166504s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066201210s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183166504s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066201210s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183166504s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066201210s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183166504s@ mbc={}] exit Start 0.000008 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066201210s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183166504s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.906035 7 0.000063
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916829 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.917056 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.917080 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.933197 13 0.000086
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.943520 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093912125s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210968018s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067193031s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183364868s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.944857 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.945145 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065909386s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.182998657s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093876839s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210968018s@ mbc={}] exit Reset 0.000051 1 0.000072
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093876839s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210968018s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093876839s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210968018s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093876839s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210968018s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065887451s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.182998657s@ mbc={}] exit Reset 0.000037 1 0.000055
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093876839s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210968018s@ mbc={}] exit Start 0.000008 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093876839s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210968018s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065887451s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.182998657s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065887451s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.182998657s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065887451s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.182998657s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.906228 7 0.000041
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065887451s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.182998657s@ mbc={}] exit Start 0.000008 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065887451s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.182998657s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.933440 13 0.000095
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.944582 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.944739 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.945353 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902068 7 0.000032
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.934205 13 0.000113
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.945186 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.945369 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.945427 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064443588s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181739807s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064415932s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181739807s@ mbc={}] exit Reset 0.000069 1 0.000078
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064415932s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181739807s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064415932s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181739807s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064415932s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181739807s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064415932s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181739807s@ mbc={}] exit Start 0.000007 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064415932s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181739807s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064523697s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181938171s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064498901s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181938171s@ mbc={}] exit Reset 0.000210 1 0.000278
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064498901s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181938171s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064498901s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181938171s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064498901s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181938171s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064498901s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181938171s@ mbc={}] exit Start 0.000016 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064498901s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181938171s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902227 7 0.000072
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.915218 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.915428 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.916702 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.915500 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.934578 13 0.000073
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916974 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093406677s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210983276s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.945615 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.917499 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.945746 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.945836 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093382835s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210983276s@ mbc={}] exit Reset 0.000050 1 0.001111
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093382835s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210983276s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.917531 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093382835s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210983276s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093382835s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210983276s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093382835s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210983276s@ mbc={}] exit Start 0.000007 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063798904s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181411743s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093382835s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210983276s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063775063s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181411743s@ mbc={}] exit Reset 0.000042 1 0.000070
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093309402s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210937500s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063775063s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181411743s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063775063s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181411743s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063775063s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181411743s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063775063s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181411743s@ mbc={}] exit Start 0.000007 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063775063s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181411743s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093281746s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210937500s@ mbc={}] exit Reset 0.000050 1 0.000527
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093281746s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210937500s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.915215 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.935918 13 0.000126
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.945905 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.946068 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.946138 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093281746s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210937500s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093281746s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210937500s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093281746s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210937500s@ mbc={}] exit Start 0.000106 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093281746s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210937500s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.936706 13 0.000092
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063556671s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181381226s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.946681 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.946855 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.946886 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063532829s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181381226s@ mbc={}] exit Reset 0.000112 1 0.000129
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063532829s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181381226s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063532829s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181381226s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.917584 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063532829s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181381226s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063532829s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181381226s@ mbc={}] exit Start 0.000010 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.917639 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063532829s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181381226s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063310623s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181175232s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902640 7 0.000041
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.915777 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097242355s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215141296s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063269615s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181175232s@ mbc={}] exit Reset 0.000058 1 0.000107
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.915861 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063269615s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181175232s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.915895 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063269615s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181175232s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063269615s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181175232s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063269615s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181175232s@ mbc={}] exit Start 0.000008 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063269615s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181175232s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097218513s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215141296s@ mbc={}] exit Reset 0.000045 1 0.000688
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097218513s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215141296s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097218513s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215141296s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097218513s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215141296s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097218513s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215141296s@ mbc={}] exit Start 0.000008 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097218513s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215141296s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097195625s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215148926s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097170830s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215148926s@ mbc={}] exit Reset 0.000060 1 0.000082
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097170830s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215148926s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097170830s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215148926s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097170830s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215148926s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097170830s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215148926s@ mbc={}] exit Start 0.000007 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.936189 13 0.000135
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.942584 13 0.000151
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097170830s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215148926s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.946416 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.946739 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.947109 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902742 7 0.000068
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.946948 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.947146 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.915546 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.915715 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.915758 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097122192s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215202332s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097093582s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] exit Reset 0.000047 1 0.000099
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097093582s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097093582s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097093582s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097093582s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] exit Start 0.000023 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097093582s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902808 7 0.000039
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.915626 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.915768 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.915805 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097187996s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215385437s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.935135 13 0.000118
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.946832 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.947337 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.947380 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097146034s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] exit Reset 0.000064 1 0.000096
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097146034s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097146034s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097146034s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097146034s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] exit Start 0.000015 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097146034s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902914 7 0.000043
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.915607 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.915855 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.915895 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097031593s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215385437s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097012520s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] exit Reset 0.000047 1 0.000059
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.947032 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097012520s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097012520s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097012520s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097012520s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097012520s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057223320s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.175636292s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057195663s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.175636292s@ mbc={}] exit Reset 0.000052 1 0.000440
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057195663s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.175636292s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057195663s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.175636292s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057195663s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.175636292s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057195663s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.175636292s@ mbc={}] exit Start 0.000018 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057195663s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.175636292s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096585274s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215202332s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063093185s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181732178s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096489906s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] exit Reset 0.001166 1 0.001257
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096489906s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096489906s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096489906s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096489906s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] exit Start 0.000010 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096489906s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062900543s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181732178s@ mbc={}] exit Reset 0.000589 1 0.000617
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062900543s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181732178s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062900543s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181732178s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062900543s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181732178s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007565 2 0.000024
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062900543s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181732178s@ mbc={}] exit Start 0.000008 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062900543s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181732178s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063192368s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181236267s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062316895s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181236267s@ mbc={}] exit Reset 0.000900 1 0.000919
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062316895s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181236267s@ mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062316895s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181236267s@ mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062316895s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181236267s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062316895s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181236267s@ mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062316895s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181236267s@ mbc={}] enter Started/Stray
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 46 handle_osd_map epochs [46,46], i have 46, src has [1,46]
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000130 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000013 1 0.000036
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000013 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000158 1 0.000063
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000026 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000011
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000069 1 0.000036
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000056 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000012 1 0.000020
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000114 1 0.000037
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000031 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000020
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000083 1 0.000056
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000038 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000011
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000082 1 0.000030
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000057 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000010 1 0.000024
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000033 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000103 1 0.000065
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000144 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000038
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000067 1 0.000045
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000051 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000011
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000051 1 0.000026
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000056 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000018
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000103 1 0.000035
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000048 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000094 1 0.000030
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000043 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000080 1 0.000028
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000040 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000009 1 0.000016
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000122 1 0.000049
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000048 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000010 1 0.000019
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000108 1 0.000062
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000050 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000012
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000009 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000057 1 0.000039
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000035 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000013
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000150 1 0.000031
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000069 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000017
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000010 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000094 1 0.000039
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000025 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000105
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000075 1 0.000032
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000040 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000010 1 0.000036
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000088 1 0.000041
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000031 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000012
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000063 1 0.000029
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007334 2 0.000044
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007772 2 0.000056
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.006756 2 0.000044
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.006317 2 0.000026
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005922 2 0.000037
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.006610 2 0.000032
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005573 2 0.000034
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005349 2 0.000022
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004919 2 0.000035
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004597 2 0.000028
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003814 2 0.000033
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003272 2 0.000033
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003008 2 0.000033
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003657 2 0.000027
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000007 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.002087 2 0.000047
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004284 2 0.000035
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.002499 2 0.000024
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004733 2 0.000031
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.002034 2 0.000051
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e(unlocked)] enter Initial
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000031 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000018
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000009 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000082 1 0.000040
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000381 2 0.000033
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56541184 unmapped: 2121728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 46 handle_osd_map epochs [46,47], i have 46, src has [1,47]
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 46 handle_osd_map epochs [47,47], i have 47, src has [1,47]
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075068 2 0.000056
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.081146 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075117 2 0.000068
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.081575 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075086 2 0.000051
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.081837 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075547 2 0.000042
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.082455 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075663 2 0.000043
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.083633 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075773 2 0.000057
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.083223 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.111282 2 0.000022
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.115318 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.107981 2 0.000039
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.115635 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.112342 2 0.000023
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.115950 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.112443 2 0.000021
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.116227 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075826 2 0.000029
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.081508 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075818 2 0.000038
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.080882 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075917 2 0.000027
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.081343 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075925 2 0.000031
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.080649 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075549 2 0.000024
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.080394 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075644 2 0.000025
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.080082 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075847 2 0.000039
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.079807 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075821 2 0.000035
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.079573 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075908 2 0.000033
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.079364 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.113150 2 0.000026
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.117095 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.113265 2 0.000030
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.117452 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.113864 2 0.000026
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.118209 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.113461 2 0.000030
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.118026 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.114150 2 0.000043
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.119035 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.114102 2 0.000031
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.118750 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.114205 2 0.000041
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.119468 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.076687 2 0.000045
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.079852 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.115195 2 0.000031
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.120072 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.073550 2 0.000035
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.074052 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.115579 2 0.000041
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.120785 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.077176 2 0.000028
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.079778 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.115798 2 0.000032
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.121333 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.115927 2 0.000034
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.121906 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.116092 2 0.000062
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.122422 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.077217 2 0.000030
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.079347 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.077516 2 0.000033
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.079728 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 handle_osd_map epochs [47,47], i have 47, src has [1,47]
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007370 4 0.000160
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000012 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007457 4 0.000140
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000011 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007439 4 0.000289
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008189 4 0.000091
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018105 4 0.000098
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000012 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018194 4 0.000073
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017876 4 0.000060
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018159 4 0.000293
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018094 4 0.000049
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018756 4 0.000059
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000003 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018734 4 0.000052
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018690 4 0.000041
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018667 4 0.000040
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018633 4 0.000038
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018556 4 0.000038
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000003 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018509 4 0.000035
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.019009 4 0.000053
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018700 4 0.000041
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018740 4 0.000056
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018873 4 0.000039
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000024 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018347 4 0.000054
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018380 4 0.000039
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018391 4 0.000080
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018267 4 0.000043
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018661 4 0.000355
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018917 4 0.000435
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018369 4 0.000056
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018497 4 0.000856
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018039 4 0.000076
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000015 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017968 4 0.000048
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000003 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018036 4 0.000092
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017902 4 0.000044
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017853 4 0.000039
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000003 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017807 4 0.000064
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017805 4 0.000050
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017746 4 0.000051
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000544 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.140052 7 0.000054
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.139802 7 0.000110
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.139496 7 0.000070
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.140735 7 0.000123
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000129 1 0.000067
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.139764 7 0.000069
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000155 1 0.000032
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.139899 7 0.000056
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000292 1 0.000021
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000414 1 0.000086
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000273 1 0.000026
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000299 1 0.000036
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.140516 7 0.000919
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000077 1 0.000043
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: unregistering
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.c deep-scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.146425 7 0.000106
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.146103 7 0.000085
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.145670 7 0.000499
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.144601 7 0.000080
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000074 1 0.000035
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000176 1 0.000016
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000213 1 0.000015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000240 1 0.000014
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.145949 7 0.000067
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.145822 7 0.000148
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.146021 7 0.000126
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.146573 7 0.000080
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.145554 7 0.000077
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.145089 7 0.000068
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.145401 7 0.000061
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.145081 7 0.000093
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.144349 7 0.000091
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.144976 7 0.000124
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.144869 7 0.000099
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.144762 7 0.000080
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.151294 7 0.000069
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000243 1 0.000026
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000265 1 0.000013
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000360 1 0.000012
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000556 1 0.000013
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000610 1 0.000013
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000649 1 0.000016
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000675 1 0.000016
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001032 1 0.000018
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001162 1 0.000017
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001198 1 0.000015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001225 1 0.000012
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001259 1 0.000013
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001274 1 0.000022
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.151282 7 0.000075
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.151377 7 0.000076
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.154835 7 0.000071
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.150218 7 0.000055
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.149424 7 0.000054
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.149584 7 0.000196
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.149767 7 0.000118
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000133 1 0.000057
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000188 1 0.000017
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.151421 7 0.000299
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000219 1 0.000016
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000240 1 0.000015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.150994 7 0.000064
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.151490 7 0.000115
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.149212 7 0.000086
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.151109 7 0.000070
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.151119 7 0.000069
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.155768 7 0.000047
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.155903 7 0.000044
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.156051 7 0.000056
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.149122 7 0.000093
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.150711 7 0.000098
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000726 1 0.000014
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.149342 7 0.000078
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000835 1 0.000014
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000661 1 0.000015
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000708 1 0.000012
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001152 1 0.000011
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001359 1 0.000435
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000980 1 0.000025
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001057 1 0.000027
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001529 1 0.000350
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001246 1 0.000014
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001466 1 0.000023
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001580 1 0.000062
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001749 1 0.000240
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001527 1 0.000636
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.002555 1 0.001843
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.c deep-scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.17( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.029881 1 0.000052
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.17( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.030055 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.17( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.170150 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.13( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.034098 1 0.000057
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.13( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.034279 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.13( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.173809 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1b( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.041486 1 0.000034
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1b( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.041824 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1b( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.182592 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.11( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.048864 1 0.000028
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.11( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.049336 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.11( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.189180 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.12( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.056115 1 0.000019
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.12( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.056430 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.12( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.196228 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.15( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.063735 1 0.000038
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.15( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.064086 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.15( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.204024 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.11( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.067404 1 0.000041
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.11( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.067541 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.11( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.208131 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.16( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.068841 1 0.000065
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.16( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.068950 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.16( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.215421 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.9( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.076196 1 0.000023
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.9( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.076410 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.9( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.222547 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.d( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.083285 1 0.000022
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.d( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.083525 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.d( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.229223 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.a( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.090615 1 0.000013
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.a( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.090893 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.a( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.235530 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.3( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.098030 1 0.000025
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.3( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.098311 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.3( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.244293 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.5( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.104882 1 0.000041
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.5( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.105194 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.5( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.251065 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.4( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.112211 1 0.000022
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.4( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.112598 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.4( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.258646 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.7( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.119568 1 0.000029
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.7( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.120158 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.7( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.266769 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.6( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.127023 1 0.000021
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.6( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.127678 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.6( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.273270 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56877056 unmapped: 1785856 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.134283 1 0.000026
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.134963 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.280081 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.9( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.141738 1 0.000025
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.9( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.142479 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.9( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.287929 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.c( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.148539 1 0.000028
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.c( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.149606 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.c( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.294725 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.f( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.156023 1 0.000031
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.f( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.157227 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.f( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.301617 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1a( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.163234 1 0.000019
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1a( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.164462 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1a( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.309482 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.19( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.170455 1 0.000025
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.19( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.171714 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.19( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.316617 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.18( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.178009 1 0.000023
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.18( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.179301 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.18( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.324087 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1d( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.185443 1 0.000021
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1d( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.186761 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1d( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.338090 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.14( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.189753 1 0.000050
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.14( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.189918 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.14( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.341246 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.16( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.197090 1 0.000035
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.16( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.197323 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.16( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.352186 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.3( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.204359 1 0.000024
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.3( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.204620 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.3( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.354882 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.b( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.211785 1 0.000028
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.b( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.212056 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.b( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.361506 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.2( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.218549 1 0.000041
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.2( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.219320 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.2( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.369043 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.5( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.225795 1 0.000063
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.5( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.226664 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.5( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.376468 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.2( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.233286 1 0.000035
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.2( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.233984 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.2( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.385005 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.f( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.240612 1 0.000382
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.f( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.241351 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.f( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.392874 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1d( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.247489 1 0.000049
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1d( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.248678 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1d( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.397935 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.15( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.254868 1 0.000036
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.15( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.256263 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.15( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.407966 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.18( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.262424 1 0.000044
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.18( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.263453 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.18( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.419245 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.7( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.269418 1 0.000038
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.7( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.270982 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.7( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.422128 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.19( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.277107 1 0.000122
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.19( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.278244 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.19( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.434202 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1e( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.284107 1 0.000034
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1e( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.285407 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1e( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.441502 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.8( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.291503 1 0.000077
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.8( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.293041 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.8( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.443788 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1c( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.298705 1 0.000035
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1c( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.300337 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1c( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.449499 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.4( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.306096 1 0.000027
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.4( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.307889 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.4( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.459064 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1f( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.313598 1 0.000023
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1f( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.315171 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1f( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.464571 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.13( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.320802 1 0.000020
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.13( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.323399 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.13( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.474831 0 0.000000
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 323898 data_alloc: 218103808 data_used: 0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56909824 unmapped: 1753088 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.e scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.e scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56975360 unmapped: 1687552 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57139200 unmapped: 1523712 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57171968 unmapped: 1490944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57171968 unmapped: 1490944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 343379 data_alloc: 218103808 data_used: 0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57180160 unmapped: 1482752 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.10 deep-scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.10 deep-scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56983552 unmapped: 1679360 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56983552 unmapped: 1679360 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.12 deep-scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.735406876s of 10.144852638s, submitted: 376
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.12 deep-scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56983552 unmapped: 1679360 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.14 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.14 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56934400 unmapped: 1728512 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 345591 data_alloc: 218103808 data_used: 0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56934400 unmapped: 1728512 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56934400 unmapped: 1728512 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56942592 unmapped: 1720320 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1a scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1a scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56942592 unmapped: 1720320 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56950784 unmapped: 1712128 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1e scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1e scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 347887 data_alloc: 218103808 data_used: 0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57040896 unmapped: 1622016 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57065472 unmapped: 1597440 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57065472 unmapped: 1597440 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.6 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.991518974s of 10.020858765s, submitted: 8
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.6 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57073664 unmapped: 1589248 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57073664 unmapped: 1589248 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 350181 data_alloc: 218103808 data_used: 0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57114624 unmapped: 1548288 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.a scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.a scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57122816 unmapped: 1540096 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.b scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.b scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57131008 unmapped: 1531904 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57139200 unmapped: 1523712 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57139200 unmapped: 1523712 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.d scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.d scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 353622 data_alloc: 218103808 data_used: 0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57171968 unmapped: 1490944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57196544 unmapped: 1466368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.e scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.e scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57196544 unmapped: 1466368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57196544 unmapped: 1466368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57204736 unmapped: 1458176 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 354769 data_alloc: 218103808 data_used: 0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57204736 unmapped: 1458176 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57221120 unmapped: 1441792 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57221120 unmapped: 1441792 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57221120 unmapped: 1441792 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57229312 unmapped: 1433600 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.10 deep-scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.091085434s of 17.134098053s, submitted: 12
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 355917 data_alloc: 218103808 data_used: 0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.10 deep-scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57253888 unmapped: 1409024 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57262080 unmapped: 1400832 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.17 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.17 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57262080 unmapped: 1400832 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57270272 unmapped: 1392640 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57278464 unmapped: 1384448 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1b scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1b scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 358213 data_alloc: 218103808 data_used: 0
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57286656 unmapped: 1376256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57286656 unmapped: 1376256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57286656 unmapped: 1376256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57294848 unmapped: 1368064 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57303040 unmapped: 1359872 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 359013 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57303040 unmapped: 1359872 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57311232 unmapped: 1351680 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1c scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.958663940s of 11.982688904s, submitted: 6
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1c scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57319424 unmapped: 1343488 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1f scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1f scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57327616 unmapped: 1335296 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57327616 unmapped: 1335296 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 361309 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57327616 unmapped: 1335296 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57344000 unmapped: 1318912 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57344000 unmapped: 1318912 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57352192 unmapped: 1310720 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57352192 unmapped: 1310720 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 363605 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57352192 unmapped: 1310720 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57368576 unmapped: 1294336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.f scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.919653893s of 10.155382156s, submitted: 10
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.f scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57368576 unmapped: 1294336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57376768 unmapped: 1286144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57376768 unmapped: 1286144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 365900 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57376768 unmapped: 1286144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57384960 unmapped: 1277952 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57384960 unmapped: 1277952 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57393152 unmapped: 1269760 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57393152 unmapped: 1269760 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.e scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.e scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 367047 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57401344 unmapped: 1261568 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57401344 unmapped: 1261568 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57401344 unmapped: 1261568 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57409536 unmapped: 1253376 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57409536 unmapped: 1253376 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 367047 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57417728 unmapped: 1245184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57417728 unmapped: 1245184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57425920 unmapped: 1236992 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57434112 unmapped: 1228800 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57434112 unmapped: 1228800 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 367047 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57442304 unmapped: 1220608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57442304 unmapped: 1220608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57442304 unmapped: 1220608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.a scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.934640884s of 20.948490143s, submitted: 4
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.a scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57450496 unmapped: 1212416 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57458688 unmapped: 1204224 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 368194 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57458688 unmapped: 1204224 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57466880 unmapped: 1196032 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57466880 unmapped: 1196032 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57483264 unmapped: 1179648 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.14 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.14 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57491456 unmapped: 1171456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 371637 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57499648 unmapped: 1163264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.11 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.11 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57507840 unmapped: 1155072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.327646255s of 11.902298927s, submitted: 10
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 373933 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57532416 unmapped: 1130496 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57548800 unmapped: 1114112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.13 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.13 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57548800 unmapped: 1114112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.15 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.15 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57556992 unmapped: 1105920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.1f deep-scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.1f deep-scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57556992 unmapped: 1105920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 377377 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57524224 unmapped: 1138688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57524224 unmapped: 1138688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57524224 unmapped: 1138688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 378525 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57540608 unmapped: 1122304 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57548800 unmapped: 1114112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57565184 unmapped: 1097728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57565184 unmapped: 1097728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57565184 unmapped: 1097728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.18 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.905238152s of 15.089574814s, submitted: 10
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.18 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 379673 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57589760 unmapped: 1073152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57589760 unmapped: 1073152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57589760 unmapped: 1073152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57597952 unmapped: 1064960 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57597952 unmapped: 1064960 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 380821 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57606144 unmapped: 1056768 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57614336 unmapped: 1048576 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57622528 unmapped: 1040384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57622528 unmapped: 1040384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57622528 unmapped: 1040384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 381969 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57630720 unmapped: 1032192 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57630720 unmapped: 1032192 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57638912 unmapped: 1024000 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57638912 unmapped: 1024000 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.969743729s of 13.996441841s, submitted: 6
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57655296 unmapped: 1007616 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 383117 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57655296 unmapped: 1007616 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57655296 unmapped: 1007616 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57663488 unmapped: 999424 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57671680 unmapped: 991232 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57679872 unmapped: 983040 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 384265 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57688064 unmapped: 974848 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57688064 unmapped: 974848 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57696256 unmapped: 966656 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57696256 unmapped: 966656 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.e scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.e scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57704448 unmapped: 958464 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 386560 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57704448 unmapped: 958464 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57712640 unmapped: 950272 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57720832 unmapped: 942080 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.a scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.829962730s of 13.884933472s, submitted: 8
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.a scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57745408 unmapped: 917504 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57761792 unmapped: 901120 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 387707 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57761792 unmapped: 901120 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57761792 unmapped: 901120 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57769984 unmapped: 892928 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57769984 unmapped: 892928 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57778176 unmapped: 884736 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 388854 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57778176 unmapped: 884736 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57778176 unmapped: 884736 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57786368 unmapped: 876544 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.005264282s of 10.019852638s, submitted: 4
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57794560 unmapped: 868352 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57794560 unmapped: 868352 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 390001 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57802752 unmapped: 860160 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57802752 unmapped: 860160 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.2 deep-scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.2 deep-scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57819136 unmapped: 843776 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57827328 unmapped: 835584 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57843712 unmapped: 819200 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.c deep-scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.c deep-scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 393442 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57843712 unmapped: 819200 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.7 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.7 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57851904 unmapped: 811008 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57851904 unmapped: 811008 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57851904 unmapped: 811008 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57860096 unmapped: 802816 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 394589 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57868288 unmapped: 794624 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57868288 unmapped: 794624 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57876480 unmapped: 786432 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57876480 unmapped: 786432 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1a scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.707541466s of 15.794960022s, submitted: 10
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1a scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57884672 unmapped: 778240 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 395737 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57884672 unmapped: 778240 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57884672 unmapped: 778240 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57892864 unmapped: 770048 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57909248 unmapped: 753664 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57917440 unmapped: 745472 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 398032 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57917440 unmapped: 745472 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57917440 unmapped: 745472 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57925632 unmapped: 737280 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1e scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1e scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57925632 unmapped: 737280 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57925632 unmapped: 737280 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 400327 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57933824 unmapped: 729088 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57942016 unmapped: 720896 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57950208 unmapped: 712704 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.e scrub starts
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.057037354s of 14.124808311s, submitted: 10
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.e scrub ok
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57958400 unmapped: 704512 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57958400 unmapped: 704512 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57966592 unmapped: 696320 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57966592 unmapped: 696320 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57982976 unmapped: 679936 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57982976 unmapped: 679936 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57991168 unmapped: 671744 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57991168 unmapped: 671744 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57991168 unmapped: 671744 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57999360 unmapped: 663552 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57999360 unmapped: 663552 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58007552 unmapped: 655360 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58007552 unmapped: 655360 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58015744 unmapped: 647168 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58023936 unmapped: 638976 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58023936 unmapped: 638976 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58032128 unmapped: 630784 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58032128 unmapped: 630784 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 622592 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 622592 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 622592 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 614400 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 622592 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 614400 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 614400 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 614400 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58056704 unmapped: 606208 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58056704 unmapped: 606208 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58064896 unmapped: 598016 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58073088 unmapped: 589824 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58081280 unmapped: 581632 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58081280 unmapped: 581632 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58081280 unmapped: 581632 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58089472 unmapped: 573440 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58089472 unmapped: 573440 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58105856 unmapped: 557056 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58105856 unmapped: 557056 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58105856 unmapped: 557056 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58114048 unmapped: 548864 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58114048 unmapped: 548864 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58114048 unmapped: 548864 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58122240 unmapped: 540672 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58130432 unmapped: 532480 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58138624 unmapped: 524288 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58138624 unmapped: 524288 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58138624 unmapped: 524288 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58146816 unmapped: 516096 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58146816 unmapped: 516096 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58155008 unmapped: 507904 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58163200 unmapped: 499712 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 491520 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 491520 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 491520 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58179584 unmapped: 483328 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58179584 unmapped: 483328 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58187776 unmapped: 475136 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58187776 unmapped: 475136 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58187776 unmapped: 475136 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58195968 unmapped: 466944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58195968 unmapped: 466944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58195968 unmapped: 466944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58204160 unmapped: 458752 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58204160 unmapped: 458752 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58212352 unmapped: 450560 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58212352 unmapped: 450560 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58220544 unmapped: 442368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58220544 unmapped: 442368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58220544 unmapped: 442368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 434176 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 434176 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 434176 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58236928 unmapped: 425984 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58236928 unmapped: 425984 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58245120 unmapped: 417792 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58253312 unmapped: 409600 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58253312 unmapped: 409600 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58261504 unmapped: 401408 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58261504 unmapped: 401408 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58269696 unmapped: 393216 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58269696 unmapped: 393216 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58269696 unmapped: 393216 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58277888 unmapped: 385024 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58286080 unmapped: 376832 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 368640 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 368640 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 368640 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58302464 unmapped: 360448 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58302464 unmapped: 360448 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 352256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 352256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 352256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58318848 unmapped: 344064 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58327040 unmapped: 335872 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58335232 unmapped: 327680 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58335232 unmapped: 327680 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58343424 unmapped: 319488 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58343424 unmapped: 319488 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58359808 unmapped: 303104 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58359808 unmapped: 303104 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58359808 unmapped: 303104 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58368000 unmapped: 294912 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58368000 unmapped: 294912 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58376192 unmapped: 286720 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58384384 unmapped: 278528 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58384384 unmapped: 278528 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 270336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 270336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 270336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58400768 unmapped: 262144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58400768 unmapped: 262144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58408960 unmapped: 253952 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58408960 unmapped: 253952 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 237568 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58433536 unmapped: 229376 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58433536 unmapped: 229376 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 221184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 221184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 221184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 212992 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 212992 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 212992 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58458112 unmapped: 204800 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58458112 unmapped: 204800 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 196608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 196608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 196608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 188416 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 188416 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58482688 unmapped: 180224 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58482688 unmapped: 180224 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58490880 unmapped: 172032 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58490880 unmapped: 172032 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 155648 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 122880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 155648 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 155648 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 122880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 122880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 122880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58548224 unmapped: 114688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58548224 unmapped: 114688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 106496 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 106496 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58564608 unmapped: 98304 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58564608 unmapped: 98304 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58564608 unmapped: 98304 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58572800 unmapped: 90112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58572800 unmapped: 90112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 81920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 81920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 81920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58589184 unmapped: 73728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58597376 unmapped: 65536 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 57344 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 57344 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 57344 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58613760 unmapped: 49152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58613760 unmapped: 49152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58621952 unmapped: 40960 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58621952 unmapped: 40960 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 32768 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 32768 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 32768 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 24576 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 24576 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 24576 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58646528 unmapped: 16384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58646528 unmapped: 16384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58654720 unmapped: 8192 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58654720 unmapped: 8192 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58662912 unmapped: 0 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58662912 unmapped: 0 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58662912 unmapped: 0 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58679296 unmapped: 1032192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58679296 unmapped: 1032192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58679296 unmapped: 1032192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58687488 unmapped: 1024000 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58687488 unmapped: 1024000 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58712064 unmapped: 999424 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58712064 unmapped: 999424 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58744832 unmapped: 966656 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58744832 unmapped: 966656 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58753024 unmapped: 958464 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58753024 unmapped: 958464 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58753024 unmapped: 958464 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58769408 unmapped: 942080 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58769408 unmapped: 942080 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58769408 unmapped: 942080 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58777600 unmapped: 933888 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58777600 unmapped: 933888 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58785792 unmapped: 925696 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58785792 unmapped: 925696 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58785792 unmapped: 925696 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58810368 unmapped: 901120 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58810368 unmapped: 901120 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58810368 unmapped: 901120 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58826752 unmapped: 884736 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58826752 unmapped: 884736 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58834944 unmapped: 876544 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58834944 unmapped: 876544 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58875904 unmapped: 835584 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58875904 unmapped: 835584 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58884096 unmapped: 827392 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58884096 unmapped: 827392 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58884096 unmapped: 827392 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58892288 unmapped: 819200 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58892288 unmapped: 819200 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58900480 unmapped: 811008 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58900480 unmapped: 811008 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58900480 unmapped: 811008 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58916864 unmapped: 794624 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58916864 unmapped: 794624 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58916864 unmapped: 794624 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58941440 unmapped: 770048 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58941440 unmapped: 770048 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58949632 unmapped: 761856 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58949632 unmapped: 761856 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58957824 unmapped: 753664 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58966016 unmapped: 745472 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58966016 unmapped: 745472 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58998784 unmapped: 712704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58998784 unmapped: 712704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59006976 unmapped: 704512 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59006976 unmapped: 704512 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59006976 unmapped: 704512 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59015168 unmapped: 696320 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59015168 unmapped: 696320 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59023360 unmapped: 688128 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59023360 unmapped: 688128 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59047936 unmapped: 663552 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59047936 unmapped: 663552 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59047936 unmapped: 663552 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59056128 unmapped: 655360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59056128 unmapped: 655360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59080704 unmapped: 630784 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59080704 unmapped: 630784 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59097088 unmapped: 614400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59097088 unmapped: 614400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59121664 unmapped: 589824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59138048 unmapped: 573440 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59138048 unmapped: 573440 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59162624 unmapped: 548864 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59162624 unmapped: 548864 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59170816 unmapped: 540672 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59219968 unmapped: 491520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.4 total, 600.0 interval
Cumulative writes: 4162 writes, 19K keys, 4162 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
Cumulative WAL: 4162 writes, 352 syncs, 11.82 writes per sync, written: 0.02 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 4162 writes, 19K keys, 4162 commit groups, 1.0 writes per commit group, ingest: 15.87 MB, 0.03 MB/s
Interval WAL: 4162 writes, 352 syncs, 11.82 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.4 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.4 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.4 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14764 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59383808 unmapped: 327680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59383808 unmapped: 327680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59383808 unmapped: 327680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59465728 unmapped: 245760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59465728 unmapped: 245760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59547648 unmapped: 163840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59547648 unmapped: 163840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59547648 unmapped: 163840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59621376 unmapped: 90112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59621376 unmapped: 90112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: mgrc ms_handle_reset ms_handle_reset con 0x5595d67d3c00
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3312476512
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3312476512,v1:192.168.122.100:6801/3312476512]
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: mgrc handle_mgr_configure stats_period=5
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 ms_handle_reset con 0x5595d6fa8c00 session 0x5595d72c4960
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:27 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.4 total, 600.0 interval
Cumulative writes: 4162 writes, 19K keys, 4162 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 4162 writes, 352 syncs, 11.82 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.4 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.4 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.4 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown,
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 47 handle_osd_map epochs [48,48], i have 47, src has [1,48]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 1077.389038086s of 1077.396484375s, submitted: 2
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 48 heartbeat osd_stat(store_statfs(0x4fe14e000/0x0/0x4ffc00000, data 0x2f8bc/0x7f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 48 handle_osd_map epochs [49,49], i have 48, src has [1,49]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 9977856 heap: 70074368 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 49 handle_osd_map epochs [50,50], i have 49, src has [1,50]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 50 ms_handle_reset con 0x5595d7ac9400 session 0x5595d73d83c0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61227008 unmapped: 8847360 heap: 70074368 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 50 heartbeat osd_stat(store_statfs(0x4fd065000/0x0/0x4ffc00000, data 0x11124f6/0x1168000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61538304 unmapped: 16932864 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 50 handle_osd_map epochs [51,51], i have 50, src has [1,51]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 51 ms_handle_reset con 0x5595d8136c00 session 0x5595d6e51a40
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 540377 data_alloc: 218103808 data_used: 28672
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 51 heartbeat osd_stat(store_statfs(0x4fd061000/0x0/0x4ffc00000, data 0x1113acc/0x116b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 51 heartbeat osd_stat(store_statfs(0x4fd061000/0x0/0x4ffc00000, data 0x1113acc/0x116b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 51 heartbeat osd_stat(store_statfs(0x4fd061000/0x0/0x4ffc00000, data 0x1113acc/0x116b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 540377 data_alloc: 218103808 data_used: 28672
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 51 heartbeat osd_stat(store_statfs(0x4fd061000/0x0/0x4ffc00000, data 0x1113acc/0x116b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 51 handle_osd_map epochs [52,52], i have 51, src has [1,52]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 51 handle_osd_map epochs [52,52], i have 52, src has [1,52]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.469475746s of 11.739644051s, submitted: 54
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f6c/0x116e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f6c/0x116e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f6c/0x116e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f6c/0x116e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 24.064493179s of 24.076759338s, submitted: 13
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f8f/0x116f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 52 handle_osd_map epochs [53,53], i have 52, src has [1,53]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 53 ms_handle_reset con 0x5595d8137000 session 0x5595d8062f00
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61677568 unmapped: 16793600 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 53 ms_handle_reset con 0x5595d8137400 session 0x5595d729f680
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 53 ms_handle_reset con 0x5595d8137800 session 0x5595d81a34a0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61661184 unmapped: 16809984 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 53 heartbeat osd_stat(store_statfs(0x4fd05b000/0x0/0x4ffc00000, data 0x1116549/0x1172000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61661184 unmapped: 16809984 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 53 heartbeat osd_stat(store_statfs(0x4fd05b000/0x0/0x4ffc00000, data 0x1116549/0x1172000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 53 handle_osd_map epochs [53,54], i have 53, src has [1,54]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 53 handle_osd_map epochs [54,54], i have 54, src has [1,54]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 555402 data_alloc: 218103808 data_used: 45056
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 54 ms_handle_reset con 0x5595d7ac9400 session 0x5595d81a3e00
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 62808064 unmapped: 15663104 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 54 ms_handle_reset con 0x5595d8137000 session 0x5595d80623c0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 54 ms_handle_reset con 0x5595d8137400 session 0x5595d6e50f00
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 62824448 unmapped: 15646720 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 54 heartbeat osd_stat(store_statfs(0x4fd057000/0x0/0x4ffc00000, data 0x1117b13/0x1176000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 54 handle_osd_map epochs [54,55], i have 54, src has [1,55]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 55 ms_handle_reset con 0x5595d8137c00 session 0x5595d6e512c0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 62988288 unmapped: 15482880 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 62988288 unmapped: 15482880 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 55 handle_osd_map epochs [56,56], i have 55, src has [1,56]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 56 ms_handle_reset con 0x5595d9b9e800 session 0x5595d80921e0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 64208896 unmapped: 14262272 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 56 heartbeat osd_stat(store_statfs(0x4fd055000/0x0/0x4ffc00000, data 0x1119111/0x1178000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 559201 data_alloc: 218103808 data_used: 45056
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 64241664 unmapped: 14229504 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 56 handle_osd_map epochs [56,57], i have 56, src has [1,57]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 57 handle_osd_map epochs [57,57], i have 57, src has [1,57]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 57 ms_handle_reset con 0x5595d9b9e800 session 0x5595d8075680
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 64421888 unmapped: 14049280 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 64421888 unmapped: 14049280 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.944048882s of 11.445683479s, submitted: 143
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 57 handle_osd_map epochs [57,58], i have 57, src has [1,58]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 57 handle_osd_map epochs [58,58], i have 58, src has [1,58]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 58 ms_handle_reset con 0x5595d9b9ec00 session 0x5595d729eb40
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 58 ms_handle_reset con 0x5595d7ac9400 session 0x5595d80a52c0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 65781760 unmapped: 21086208 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 58 handle_osd_map epochs [58,59], i have 58, src has [1,59]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 58 handle_osd_map epochs [59,59], i have 59, src has [1,59]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 59 ms_handle_reset con 0x5595d8137c00 session 0x5595d7b001e0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 65945600 unmapped: 20922368 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 59 heartbeat osd_stat(store_statfs(0x4fc84b000/0x0/0x4ffc00000, data 0x191d2db/0x1982000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 854268 data_alloc: 218103808 data_used: 61440
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 59 handle_osd_map epochs [60,60], i have 59, src has [1,60]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 60 ms_handle_reset con 0x5595d8137400 session 0x5595d7bae780
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 60 ms_handle_reset con 0x5595d8136c00 session 0x5595d80785a0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 66093056 unmapped: 20774912 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 60 handle_osd_map epochs [60,61], i have 60, src has [1,61]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 61 ms_handle_reset con 0x5595d8137400 session 0x5595d73d83c0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 61 heartbeat osd_stat(store_statfs(0x4fa844000/0x0/0x4ffc00000, data 0x391fecd/0x398a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 66355200 unmapped: 20512768 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 61 handle_osd_map epochs [62,62], i have 61, src has [1,62]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 62 ms_handle_reset con 0x5595d8137c00 session 0x5595d8197860
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 62 ms_handle_reset con 0x5595d7ac9400 session 0x5595d729e960
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 66609152 unmapped: 20258816 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 62 handle_osd_map epochs [63,63], i have 62, src has [1,63]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 63 ms_handle_reset con 0x5595d8137000 session 0x5595d729f680
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 63 ms_handle_reset con 0x5595d9b9e800 session 0x5595d81963c0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 63 heartbeat osd_stat(store_statfs(0x4fc83a000/0x0/0x4ffc00000, data 0x11237d9/0x1192000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 63 ms_handle_reset con 0x5595d9b9ec00 session 0x5595d80a4b40
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 18751488 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 63 handle_osd_map epochs [63,64], i have 63, src has [1,64]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 64 heartbeat osd_stat(store_statfs(0x4fc426000/0x0/0x4ffc00000, data 0x1124dd4/0x1194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 18751488 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 64 handle_osd_map epochs [64,65], i have 64, src has [1,65]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 65 ms_handle_reset con 0x5595d8136c00 session 0x5595d7b16d20
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 65 ms_handle_reset con 0x5595d7ac9400 session 0x5595d8079a40
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 65 ms_handle_reset con 0x5595d8137400 session 0x5595d8196b40
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 613840 data_alloc: 218103808 data_used: 122880
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 65 heartbeat osd_stat(store_statfs(0x4fc422000/0x0/0x4ffc00000, data 0x11263ba/0x1197000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 18694144 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 65 handle_osd_map epochs [66,66], i have 65, src has [1,66]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 66 ms_handle_reset con 0x5595d8137400 session 0x5595d729f680
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68124672 unmapped: 18743296 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 66 handle_osd_map epochs [66,67], i have 66, src has [1,67]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 66 handle_osd_map epochs [67,67], i have 67, src has [1,67]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 67 ms_handle_reset con 0x5595d7ac9400 session 0x5595d8092b40
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68132864 unmapped: 18735104 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 67 handle_osd_map epochs [67,68], i have 67, src has [1,68]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.158089638s of 10.213048935s, submitted: 251
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68386816 unmapped: 18481152 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 68 ms_handle_reset con 0x5595d8136c00 session 0x5595d7b01680
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 68 handle_osd_map epochs [69,69], i have 68, src has [1,69]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 69 ms_handle_reset con 0x5595d9b9e800 session 0x5595d82c2f00
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 69 ms_handle_reset con 0x5595d9b9ec00 session 0x5595d729e3c0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68444160 unmapped: 18423808 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 69 heartbeat osd_stat(store_statfs(0x4fcc19000/0x0/0x4ffc00000, data 0x112ce2f/0x11a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 69 handle_osd_map epochs [70,70], i have 69, src has [1,70]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 630316 data_alloc: 218103808 data_used: 139264
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 70 ms_handle_reset con 0x5595d7ac9400 session 0x5595d729eb40
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 70 ms_handle_reset con 0x5595d8136c00 session 0x5595d729fc20
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68624384 unmapped: 18243584 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 70 handle_osd_map epochs [71,71], i have 70, src has [1,71]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 71 ms_handle_reset con 0x5595d8137400 session 0x5595d8196f00
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 18169856 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 18169856 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 71 ms_handle_reset con 0x5595d9b9e800 session 0x5595d73e14a0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 18169856 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 71 ms_handle_reset con 0x5595d8137c00 session 0x5595d73e1680
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 71 heartbeat osd_stat(store_statfs(0x4fcc12000/0x0/0x4ffc00000, data 0x112f364/0x11a9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 18169856 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 71 ms_handle_reset con 0x5595d7ac9400 session 0x5595d73d83c0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 71 handle_osd_map epochs [72,72], i have 71, src has [1,72]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 634276 data_alloc: 218103808 data_used: 139264
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68771840 unmapped: 18096128 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 72 handle_osd_map epochs [72,73], i have 72, src has [1,73]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 72 handle_osd_map epochs [73,73], i have 73, src has [1,73]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 73 ms_handle_reset con 0x5595d8136c00 session 0x5595d73d8f00
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 73 heartbeat osd_stat(store_statfs(0x4fcc10000/0x0/0x4ffc00000, data 0x1130982/0x11ad000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68771840 unmapped: 18096128 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 73 heartbeat osd_stat(store_statfs(0x4fba6d000/0x0/0x4ffc00000, data 0x113204f/0x11b0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68788224 unmapped: 18079744 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68788224 unmapped: 18079744 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68788224 unmapped: 18079744 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.005791664s of 11.874329567s, submitted: 231
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 73 handle_osd_map epochs [74,74], i have 73, src has [1,74]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 74 ms_handle_reset con 0x5595d9b9c800 session 0x5595d739d860
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 640656 data_alloc: 218103808 data_used: 151552
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68911104 unmapped: 17956864 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 74 handle_osd_map epochs [74,75], i have 74, src has [1,75]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 75 ms_handle_reset con 0x5595d9b9c400 session 0x5595d81974a0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68919296 unmapped: 17948672 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 75 heartbeat osd_stat(store_statfs(0x4fba64000/0x0/0x4ffc00000, data 0x1134c7d/0x11b8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 75 ms_handle_reset con 0x5595d9b9c000 session 0x5595d80a4d20
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68952064 unmapped: 17915904 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 75 heartbeat osd_stat(store_statfs(0x4fba64000/0x0/0x4ffc00000, data 0x1134c7d/0x11b8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 75 ms_handle_reset con 0x5595d7ac9400 session 0x5595d7d3a780
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 75 handle_osd_map epochs [75,76], i have 75, src has [1,76]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 76 ms_handle_reset con 0x5595d8136c00 session 0x5595d72b8b40
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 69074944 unmapped: 17793024 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 76 handle_osd_map epochs [77,77], i have 76, src has [1,77]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 69099520 unmapped: 17768448 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 77 ms_handle_reset con 0x5595d9b9c000 session 0x5595d729e3c0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 654076 data_alloc: 218103808 data_used: 155648
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 77 handle_osd_map epochs [78,78], i have 77, src has [1,78]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 69255168 unmapped: 17612800 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 78 ms_handle_reset con 0x5595d9b9c400 session 0x5595d7b16b40
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 78 handle_osd_map epochs [79,79], i have 78, src has [1,79]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70246400 unmapped: 16621568 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 79 heartbeat osd_stat(store_statfs(0x4fba59000/0x0/0x4ffc00000, data 0x113a3c8/0x11c3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70246400 unmapped: 16621568 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70246400 unmapped: 16621568 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d9b9c800 session 0x5595d72c52c0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d8136c00 session 0x5595d73d92c0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d7ac9400 session 0x5595d8197680
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d9b9c000 session 0x5595d8074b40
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d9b9c400 session 0x5595d729e960
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d72ad000 session 0x5595d80a5a40
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70254592 unmapped: 16613376 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d7ac9400 session 0x5595d80a4960
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d8136c00 session 0x5595d7b01e00
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 665586 data_alloc: 218103808 data_used: 172032
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70254592 unmapped: 16613376 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d9b9c000 session 0x5595d81a3a40
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 79 handle_osd_map epochs [79,80], i have 79, src has [1,80]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.607955933s of 11.038110733s, submitted: 98
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 80 ms_handle_reset con 0x5595d9b9c400 session 0x5595d7b16b40
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 80 ms_handle_reset con 0x5595d7348800 session 0x5595d7b165a0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 80 heartbeat osd_stat(store_statfs(0x4fba59000/0x0/0x4ffc00000, data 0x113a3c8/0x11c3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 16662528 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 16662528 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 80 heartbeat osd_stat(store_statfs(0x4fba57000/0x0/0x4ffc00000, data 0x113b8a0/0x11c6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 16662528 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 16646144 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 667454 data_alloc: 218103808 data_used: 172032
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 16646144 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 80 ms_handle_reset con 0x5595d9b9c000 session 0x5595d7bae3c0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70230016 unmapped: 16637952 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 80 handle_osd_map epochs [81,81], i have 80, src has [1,81]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9b9c400 session 0x5595d7b00780
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9598400 session 0x5595d8062f00
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9598000 session 0x5595d7b00f00
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9598800 session 0x5595d80743c0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70279168 unmapped: 16588800 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9598000 session 0x5595d72b90e0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 81 handle_osd_map epochs [82,82], i have 81, src has [1,82]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 82 heartbeat osd_stat(store_statfs(0x4fba54000/0x0/0x4ffc00000, data 0x113ce5a/0x11c9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70328320 unmapped: 16539648 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 82 handle_osd_map epochs [83,83], i have 82, src has [1,83]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9598400 session 0x5595d73e0d20
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70352896 unmapped: 16515072 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 83 heartbeat osd_stat(store_statfs(0x4fba50000/0x0/0x4ffc00000, data 0x113e468/0x11cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9b9c000 session 0x5595d739d0e0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9b9c400 session 0x5595d6de4d20
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9598c00 session 0x5595d7b16b40
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 679530 data_alloc: 218103808 data_used: 172032
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70336512 unmapped: 16531456 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.183552742s of 10.257410049s, submitted: 36
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 16498688 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9598000 session 0x5595d7b165a0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 83 heartbeat osd_stat(store_statfs(0x4fba4e000/0x0/0x4ffc00000, data 0x113fa80/0x11d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70377472 unmapped: 16490496 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 83 handle_osd_map epochs [84,84], i have 83, src has [1,84]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 84 ms_handle_reset con 0x5595d9598400 session 0x5595d72c52c0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 16482304 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 84 ms_handle_reset con 0x5595d7ac9400 session 0x5595d7b170e0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 84 ms_handle_reset con 0x5595d8136c00 session 0x5595d73d9860
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 16482304 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 680355 data_alloc: 218103808 data_used: 192512
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 16482304 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 84 handle_osd_map epochs [84,85], i have 84, src has [1,85]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 84 heartbeat osd_stat(store_statfs(0x4fba4c000/0x0/0x4ffc00000, data 0x1141058/0x11d1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 16465920 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 85 handle_osd_map epochs [86,86], i have 85, src has [1,86]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 86 ms_handle_reset con 0x5595d9b9c000 session 0x5595d8074b40
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 16465920 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 86 heartbeat osd_stat(store_statfs(0x4fba45000/0x0/0x4ffc00000, data 0x1143b22/0x11d7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 16465920 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 16433152 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 86 ms_handle_reset con 0x5595d8137400 session 0x5595d73d8780
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 86 ms_handle_reset con 0x5595d9b9e800 session 0x5595d8196960
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 687015 data_alloc: 218103808 data_used: 192512
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 86 ms_handle_reset con 0x5595d7ac9400 session 0x5595d73d9a40
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 16433152 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 86 handle_osd_map epochs [86,87], i have 86, src has [1,87]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 16498688 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.584367752s of 11.068427086s, submitted: 87
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 87 ms_handle_reset con 0x5595d8136c00 session 0x5595d80634a0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 87 heartbeat osd_stat(store_statfs(0x4fba43000/0x0/0x4ffc00000, data 0x1144fde/0x11da000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70418432 unmapped: 16449536 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 87 handle_osd_map epochs [87,88], i have 87, src has [1,88]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 88 ms_handle_reset con 0x5595d9598000 session 0x5595d7bae1e0
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 16441344 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 16441344 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 88 heartbeat osd_stat(store_statfs(0x4fba41000/0x0/0x4ffc00000, data 0x11465ba/0x11dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 690967 data_alloc: 218103808 data_used: 196608
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 16441344 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 16441344 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 88 heartbeat osd_stat(store_statfs(0x4fba41000/0x0/0x4ffc00000, data 0x11465ba/0x11dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 88 handle_osd_map epochs [89,89], i have 88, src has [1,89]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 693939 data_alloc: 218103808 data_used: 196608
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 89 handle_osd_map epochs [89,90], i have 89, src has [1,90]
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.918384552s of 11.027014732s, submitted: 76
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3e000/0x0/0x4ffc00000, data 0x1147a76/0x11df000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70508544 unmapped: 16359424 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: do_command 'config diff' '{prefix=config diff}'
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: do_command 'config show' '{prefix=config show}'
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 16056320 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: do_command 'counter dump' '{prefix=counter dump}'
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: do_command 'counter schema' '{prefix=counter schema}'
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 15605760 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 15720448 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:28 np0005540741 ceph-osd[90166]: do_command 'log dump' '{prefix=log dump}'
Dec  1 04:39:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Dec  1 04:39:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3916327048' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  1 04:39:28 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14768 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:39:28 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Dec  1 04:39:28 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2405876524' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  1 04:39:28 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14772 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:39:29 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14776 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2668326975' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #42. Immutable memtables: 0.
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.228024) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 42
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581969228060, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 617, "num_deletes": 251, "total_data_size": 445464, "memory_usage": 456960, "flush_reason": "Manual Compaction"}
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #43: started
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581969235153, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 43, "file_size": 439195, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17564, "largest_seqno": 18180, "table_properties": {"data_size": 435845, "index_size": 1194, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7966, "raw_average_key_size": 19, "raw_value_size": 429149, "raw_average_value_size": 1046, "num_data_blocks": 55, "num_entries": 410, "num_filter_entries": 410, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764581926, "oldest_key_time": 1764581926, "file_creation_time": 1764581969, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 43, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 7476 microseconds, and 3258 cpu microseconds.
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.235490) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #43: 439195 bytes OK
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.235627) [db/memtable_list.cc:519] [default] Level-0 commit table #43 started
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.237381) [db/memtable_list.cc:722] [default] Level-0 commit table #43: memtable #1 done
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.237460) EVENT_LOG_v1 {"time_micros": 1764581969237393, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.237487) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 442046, prev total WAL file size 442046, number of live WAL files 2.
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000039.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.238072) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [43(428KB)], [41(5337KB)]
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581969238132, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [43], "files_L6": [41], "score": -1, "input_data_size": 5905070, "oldest_snapshot_seqno": -1}
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #44: 3772 keys, 4740877 bytes, temperature: kUnknown
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581969274725, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 44, "file_size": 4740877, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 4715809, "index_size": 14527, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9477, "raw_key_size": 89955, "raw_average_key_size": 23, "raw_value_size": 4648025, "raw_average_value_size": 1232, "num_data_blocks": 622, "num_entries": 3772, "num_filter_entries": 3772, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764581969, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.274951) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 4740877 bytes
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.276318) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 161.1 rd, 129.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 5.2 +0.0 blob) out(4.5 +0.0 blob), read-write-amplify(24.2) write-amplify(10.8) OK, records in: 4281, records dropped: 509 output_compression: NoCompression
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.276334) EVENT_LOG_v1 {"time_micros": 1764581969276326, "job": 20, "event": "compaction_finished", "compaction_time_micros": 36664, "compaction_time_cpu_micros": 12635, "output_level": 6, "num_output_files": 1, "total_output_size": 4740877, "num_input_records": 4281, "num_output_records": 3772, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000043.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581969276503, "job": 20, "event": "table_file_deletion", "file_number": 43}
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581969277491, "job": 20, "event": "table_file_deletion", "file_number": 41}
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.237953) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.277559) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.277566) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.277568) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.277570) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.277573) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:39:29 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 5f0fb47a-a6a2-4329-86d7-5fba4d7a0a45 does not exist
Dec  1 04:39:29 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 97aff81c-14fd-4ab5-bff5-f0f260a84c58 does not exist
Dec  1 04:39:29 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 8bf75225-2d95-490c-b709-2f43b018763e does not exist
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:39:29 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14778 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Dec  1 04:39:29 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1834446799' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec  1 04:39:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v869: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:39:30 np0005540741 podman[260846]: 2025-12-01 09:39:30.033679988 +0000 UTC m=+0.049410923 container create f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_sutherland, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec  1 04:39:30 np0005540741 systemd[1]: Started libpod-conmon-f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e.scope.
Dec  1 04:39:30 np0005540741 podman[260846]: 2025-12-01 09:39:30.010839121 +0000 UTC m=+0.026570086 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:39:30 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:39:30 np0005540741 podman[260846]: 2025-12-01 09:39:30.135965121 +0000 UTC m=+0.151696086 container init f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_sutherland, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:39:30 np0005540741 podman[260846]: 2025-12-01 09:39:30.145199547 +0000 UTC m=+0.160930482 container start f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_sutherland, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:39:30 np0005540741 podman[260846]: 2025-12-01 09:39:30.150777988 +0000 UTC m=+0.166508923 container attach f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_sutherland, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True)
Dec  1 04:39:30 np0005540741 wonderful_sutherland[260877]: 167 167
Dec  1 04:39:30 np0005540741 systemd[1]: libpod-f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e.scope: Deactivated successfully.
Dec  1 04:39:30 np0005540741 podman[260846]: 2025-12-01 09:39:30.152000513 +0000 UTC m=+0.167731448 container died f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_sutherland, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:39:30 np0005540741 systemd[1]: var-lib-containers-storage-overlay-d9069dcc5b9a23f5b88a6637c41b1d627b12407dd0614ed311ca22345ad7c86f-merged.mount: Deactivated successfully.
Dec  1 04:39:30 np0005540741 podman[260846]: 2025-12-01 09:39:30.198730377 +0000 UTC m=+0.214461332 container remove f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_sutherland, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec  1 04:39:30 np0005540741 systemd[1]: libpod-conmon-f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e.scope: Deactivated successfully.
Dec  1 04:39:30 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:39:30 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:39:30 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:39:30 np0005540741 podman[260959]: 2025-12-01 09:39:30.382776163 +0000 UTC m=+0.055996072 container create 6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_rhodes, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:39:30 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14786 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec  1 04:39:30 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:39:30.396+0000 7fd2d6503640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec  1 04:39:30 np0005540741 ceph-mgr[75324]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec  1 04:39:30 np0005540741 systemd[1]: Started libpod-conmon-6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd.scope.
Dec  1 04:39:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "node ls"} v 0) v1
Dec  1 04:39:30 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1795086165' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec  1 04:39:30 np0005540741 podman[260959]: 2025-12-01 09:39:30.362011066 +0000 UTC m=+0.035231005 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:39:30 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:39:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/359e8c412c40f675ba46edf6017c90c3b5f808ca55f5817c181335d5126d0cab/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:39:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/359e8c412c40f675ba46edf6017c90c3b5f808ca55f5817c181335d5126d0cab/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:39:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/359e8c412c40f675ba46edf6017c90c3b5f808ca55f5817c181335d5126d0cab/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:39:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/359e8c412c40f675ba46edf6017c90c3b5f808ca55f5817c181335d5126d0cab/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:39:30 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/359e8c412c40f675ba46edf6017c90c3b5f808ca55f5817c181335d5126d0cab/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:39:30 np0005540741 podman[260959]: 2025-12-01 09:39:30.496006382 +0000 UTC m=+0.169226311 container init 6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_rhodes, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:39:30 np0005540741 podman[260959]: 2025-12-01 09:39:30.505579217 +0000 UTC m=+0.178799156 container start 6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_rhodes, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:39:30 np0005540741 podman[260959]: 2025-12-01 09:39:30.509823159 +0000 UTC m=+0.183043098 container attach 6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_rhodes, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec  1 04:39:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:39:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Dec  1 04:39:30 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/506962092' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec  1 04:39:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Dec  1 04:39:30 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/276962131' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec  1 04:39:31 np0005540741 nova_compute[250706]: 2025-12-01 09:39:31.053 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:39:31 np0005540741 nova_compute[250706]: 2025-12-01 09:39:31.058 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:39:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush dump"} v 0) v1
Dec  1 04:39:31 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/80551184' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec  1 04:39:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Dec  1 04:39:31 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3354255277' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec  1 04:39:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Dec  1 04:39:31 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3228762746' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec  1 04:39:31 np0005540741 eloquent_rhodes[260993]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:39:31 np0005540741 eloquent_rhodes[260993]: --> relative data size: 1.0
Dec  1 04:39:31 np0005540741 eloquent_rhodes[260993]: --> All data devices are unavailable
Dec  1 04:39:31 np0005540741 systemd[1]: libpod-6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd.scope: Deactivated successfully.
Dec  1 04:39:31 np0005540741 systemd[1]: libpod-6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd.scope: Consumed 1.093s CPU time.
Dec  1 04:39:31 np0005540741 podman[260959]: 2025-12-01 09:39:31.694280532 +0000 UTC m=+1.367500451 container died 6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_rhodes, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:39:31 np0005540741 systemd[1]: var-lib-containers-storage-overlay-359e8c412c40f675ba46edf6017c90c3b5f808ca55f5817c181335d5126d0cab-merged.mount: Deactivated successfully.
Dec  1 04:39:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Dec  1 04:39:31 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2676379662' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec  1 04:39:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v870: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:39:31 np0005540741 podman[260959]: 2025-12-01 09:39:31.751835898 +0000 UTC m=+1.425055817 container remove 6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_rhodes, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 04:39:31 np0005540741 systemd[1]: libpod-conmon-6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd.scope: Deactivated successfully.
Dec  1 04:39:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0) v1
Dec  1 04:39:32 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1721211012' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec  1 04:39:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Dec  1 04:39:32 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/895605000' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec  1 04:39:32 np0005540741 podman[261473]: 2025-12-01 09:39:32.424382991 +0000 UTC m=+0.050577137 container create cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_grothendieck, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef)
Dec  1 04:39:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0) v1
Dec  1 04:39:32 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/356448226' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec  1 04:39:32 np0005540741 systemd[1]: Started libpod-conmon-cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329.scope.
Dec  1 04:39:32 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:39:32 np0005540741 podman[261473]: 2025-12-01 09:39:32.397475736 +0000 UTC m=+0.023669902 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:39:32 np0005540741 podman[261473]: 2025-12-01 09:39:32.508495351 +0000 UTC m=+0.134689517 container init cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_grothendieck, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec  1 04:39:32 np0005540741 podman[261473]: 2025-12-01 09:39:32.516084489 +0000 UTC m=+0.142278635 container start cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_grothendieck, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec  1 04:39:32 np0005540741 podman[261473]: 2025-12-01 09:39:32.519386365 +0000 UTC m=+0.145580531 container attach cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_grothendieck, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:39:32 np0005540741 systemd[1]: libpod-cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329.scope: Deactivated successfully.
Dec  1 04:39:32 np0005540741 quizzical_grothendieck[261491]: 167 167
Dec  1 04:39:32 np0005540741 conmon[261491]: conmon cf4e6739ea1eba3ffcf1 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329.scope/container/memory.events
Dec  1 04:39:32 np0005540741 podman[261473]: 2025-12-01 09:39:32.525361456 +0000 UTC m=+0.151555602 container died cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_grothendieck, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec  1 04:39:32 np0005540741 systemd[1]: var-lib-containers-storage-overlay-ed3e3a11e3c3af09fd8c8fc1a658f1b132ac95d5d23fbde8cd297b01876b0c3c-merged.mount: Deactivated successfully.
Dec  1 04:39:32 np0005540741 podman[261473]: 2025-12-01 09:39:32.568697123 +0000 UTC m=+0.194891269 container remove cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_grothendieck, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Dec  1 04:39:32 np0005540741 systemd[1]: libpod-conmon-cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329.scope: Deactivated successfully.
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000027 1 0.000023
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000010 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000009
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000036 1 0.000030
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000030 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000013 1 0.000016
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000031 1 0.000024
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000012 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000027 1 0.000022
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000016 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000011
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000049 1 0.000029
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000011 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000013 1 0.000016
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000040 1 0.000029
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000011 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000027 1 0.000023
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000012 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000009
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000054 1 0.000511
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000047 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000018
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000079 1 0.000030
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000032 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000020
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000009 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000109 1 0.000037
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000105 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000020 1 0.000039
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000007 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000151 1 0.000055
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000031 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000014
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000010 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000053 1 0.000036
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000024 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000014
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000040 1 0.000052
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000022 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000008
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000053 1 0.000027
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.001060 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000024
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000007 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000076 1 0.000057
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000042 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000017
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000100 1 0.000035
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000032 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000016
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000069 1 0.000034
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000034 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000016 1 0.000024
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000046 1 0.000030
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b(unlocked)] enter Initial
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000080 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000009 1 0.000022
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000124 1 0.000086
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.333077 1 0.000029
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.336865 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.893488 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.893512 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1c] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666628838s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.502311707s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1c] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666577339s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502311707s@ mbc={}] exit Reset 0.000108 1 0.000146
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666577339s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502311707s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666577339s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502311707s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666577339s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502311707s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666577339s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502311707s@ mbc={}] exit Start 0.000007 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666577339s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502311707s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.011075 2 0.000030
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.954357 15 0.000117
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.964609 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.964710 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.964750 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045631409s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881553650s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045609474s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881553650s@ mbc={}] exit Reset 0.000053 1 0.000056
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045609474s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881553650s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045609474s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881553650s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045609474s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881553650s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045609474s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881553650s@ mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1694800268' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045609474s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881553650s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.953814 15 0.001072
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.964654 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.964818 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.964864 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045528412s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881576538s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045514107s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881576538s@ mbc={}] exit Reset 0.000026 1 0.000045
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045514107s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881576538s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045514107s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881576538s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045514107s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881576538s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045514107s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881576538s@ mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045514107s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881576538s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.333638 1 0.000031
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.337307 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.895408 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.895424 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.13] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666149139s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.502319336s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.13] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666124344s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502319336s@ mbc={}] exit Reset 0.000054 1 0.000076
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666124344s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502319336s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666124344s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502319336s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666124344s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502319336s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666124344s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502319336s@ mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666124344s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502319336s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.954771 15 0.000146
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.965101 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.965172 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.965197 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045070648s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881378174s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045049667s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881378174s@ mbc={}] exit Reset 0.000039 1 0.000059
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045049667s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881378174s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045049667s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881378174s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045049667s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881378174s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045049667s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881378174s@ mbc={}] exit Start 0.000007 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045049667s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881378174s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.954879 15 0.000087
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.965189 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.965337 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.965369 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045023918s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881462097s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045001030s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881462097s@ mbc={}] exit Reset 0.000039 1 0.000060
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045001030s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881462097s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045001030s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881462097s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045001030s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881462097s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045001030s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881462097s@ mbc={}] exit Start 0.000008 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045001030s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881462097s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.334004 1 0.000027
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.337607 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.900750 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.900795 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.11] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665835381s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.502403259s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.11] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665820122s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502403259s@ mbc={}] exit Reset 0.000029 1 0.000046
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665820122s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502403259s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665820122s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502403259s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665820122s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502403259s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665820122s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502403259s@ mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665820122s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502403259s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.955301 15 0.000096
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.965713 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.965818 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.965864 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044493675s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881225586s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044479370s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881225586s@ mbc={}] exit Reset 0.000026 1 0.000052
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044479370s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881225586s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044479370s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881225586s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044479370s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881225586s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044479370s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881225586s@ mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044479370s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881225586s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.955515 15 0.000087
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.965890 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.966026 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.966048 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044302940s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881187439s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044287682s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881187439s@ mbc={}] exit Reset 0.000027 1 0.000051
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044287682s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881187439s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044287682s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881187439s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044287682s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881187439s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044287682s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881187439s@ mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044287682s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881187439s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.334066 1 0.000031
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.337831 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.895928 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.895942 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.15] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665797234s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.502769470s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.15] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665776253s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502769470s@ mbc={}] exit Reset 0.000031 1 0.000043
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665776253s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502769470s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665776253s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502769470s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665776253s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502769470s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665776253s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502769470s@ mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665776253s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502769470s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.955592 15 0.000119
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.966157 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.966265 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.966291 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043943405s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881034851s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043929100s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] exit Reset 0.000039 1 0.000039
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043929100s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043929100s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043929100s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043929100s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043929100s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.956014 15 0.000236
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.966411 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.966570 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.966614 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043830872s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881057739s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043811798s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881057739s@ mbc={}] exit Reset 0.000048 1 0.000071
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043811798s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881057739s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043811798s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881057739s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043811798s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881057739s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043811798s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881057739s@ mbc={}] exit Start 0.000007 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043811798s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881057739s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.327971 1 0.000040
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.336448 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.898669 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.898693 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.a] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671725273s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.509086609s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.a] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671697617s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509086609s@ mbc={}] exit Reset 0.000046 1 0.000081
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671697617s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509086609s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671697617s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509086609s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671697617s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509086609s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671697617s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509086609s@ mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671697617s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509086609s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.328889 1 0.000177
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.337779 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.899010 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.899026 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.9] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670864105s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508338928s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.9] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670849800s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508338928s@ mbc={}] exit Reset 0.000024 1 0.000039
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670849800s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508338928s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670849800s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508338928s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670849800s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508338928s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670849800s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508338928s@ mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670849800s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508338928s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.956459 15 0.000061
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.966799 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.966979 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.967012 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043417931s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880981445s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043404579s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880981445s@ mbc={}] exit Reset 0.000022 1 0.000045
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043404579s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880981445s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043404579s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880981445s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043404579s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880981445s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043404579s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880981445s@ mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043404579s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880981445s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.329534 1 0.000047
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.338024 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.898587 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.898614 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.8] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670290947s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508003235s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.8] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670258522s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508003235s@ mbc={}] exit Reset 0.000047 1 0.000069
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670258522s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508003235s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670258522s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508003235s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670258522s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508003235s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670258522s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508003235s@ mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670258522s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508003235s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.329537 1 0.000042
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.338056 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.897494 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.897510 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.f] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670177460s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508010864s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.f] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670162201s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508010864s@ mbc={}] exit Reset 0.000025 1 0.000041
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670162201s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508010864s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670162201s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508010864s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670162201s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508010864s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670162201s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508010864s@ mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670162201s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508010864s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.329808 1 0.000029
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.338125 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.899730 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.899751 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.6] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670031548s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508018494s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.6] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670013428s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508018494s@ mbc={}] exit Reset 0.000034 1 0.000050
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670013428s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508018494s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670013428s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508018494s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670013428s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508018494s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670013428s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508018494s@ mbc={}] exit Start 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670013428s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508018494s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.957691 15 0.000066
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.968302 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.968368 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.968397 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.329858 1 0.000034
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.338282 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.900450 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.957690 15 0.000073
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.900470 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.968317 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.968435 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.968476 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042176247s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880577087s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.5] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669865608s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508308411s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042231560s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880676270s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042157173s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] exit Reset 0.000096 1 0.000117
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042157173s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042157173s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042157173s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.5] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042157173s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] exit Start 0.000011 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042157173s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042071342s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880577087s@ mbc={}] exit Reset 0.000208 1 0.000232
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669753075s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508308411s@ mbc={}] exit Reset 0.000141 1 0.000180
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042071342s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880577087s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669753075s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508308411s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042071342s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880577087s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042071342s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880577087s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042071342s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880577087s@ mbc={}] exit Start 0.000017 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042071342s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880577087s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669753075s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508308411s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669753075s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508308411s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669753075s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508308411s@ mbc={}] exit Start 0.000033 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669753075s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508308411s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.957996 15 0.000137
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.968799 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.968855 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.968878 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041958809s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880676270s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014513 2 0.000057
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041932106s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] exit Reset 0.000047 1 0.000068
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041932106s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041932106s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041932106s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041932106s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] exit Start 0.000017 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041932106s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330342 1 0.000043
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.338888 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014302 2 0.000050
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.901479 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.015063 2 0.000778
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.901542 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000008 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000008 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.4] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669230461s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508064270s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.4] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000009 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013925 2 0.000026
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013346 2 0.000029
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.012307 2 0.000060
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330203 1 0.000028
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669204712s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508064270s@ mbc={}] exit Reset 0.000047 1 0.000078
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.338716 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.901442 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669204712s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508064270s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.958032 15 0.000286
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.901463 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.968537 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.969404 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330235 1 0.000029
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669622421s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508628845s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.338727 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.901333 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.969436 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.901355 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669596672s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508628845s@ mbc={}] exit Reset 0.000054 1 0.000090
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669596672s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508628845s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669596672s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508628845s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669596672s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508628845s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669596672s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508628845s@ mbc={}] exit Start 0.000007 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041983604s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881034851s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669596672s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508628845s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.2] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669599533s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508674622s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041935921s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] exit Reset 0.000070 1 0.000125
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041935921s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041935921s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041935921s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.958809 15 0.000086
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041935921s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] exit Start 0.000008 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.2] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041935921s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669544220s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508674622s@ mbc={}] exit Reset 0.000102 1 0.000132
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.969376 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669544220s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508674622s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669544220s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508674622s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.969565 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669544220s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508674622s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669544220s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508674622s@ mbc={}] exit Start 0.000012 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669544220s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508674622s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.969600 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041315079s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880500793s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.329673 1 0.000034
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.338879 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.901864 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041291237s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880500793s@ mbc={}] exit Reset 0.000040 1 0.000096
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.901927 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041291237s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880500793s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669204712s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508064270s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041291237s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880500793s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669204712s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508064270s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041291237s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880500793s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669204712s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508064270s@ mbc={}] exit Start 0.000279 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.3] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.958986 15 0.000109
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669204712s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508064270s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041291237s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880500793s@ mbc={}] exit Start 0.000010 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041291237s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880500793s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.969766 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.969863 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669652939s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508903503s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.969889 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.3] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669572830s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508903503s@ mbc={}] exit Reset 0.000105 1 0.000138
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041206360s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880538940s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669572830s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508903503s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669572830s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508903503s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669572830s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508903503s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330237 1 0.000025
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041181564s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880538940s@ mbc={}] exit Reset 0.000043 1 0.000105
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041181564s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880538940s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.339037 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041181564s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880538940s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.899276 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041181564s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880538940s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.958880 15 0.000081
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.899296 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.969565 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041181564s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880538940s@ mbc={}] exit Start 0.000014 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669572830s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508903503s@ mbc={}] exit Start 0.000011 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.970869 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041181564s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880538940s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.c] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669507027s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508911133s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669572830s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508903503s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.c] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669469833s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508911133s@ mbc={}] exit Reset 0.000055 1 0.000085
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.970999 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669469833s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508911133s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669469833s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508911133s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669469833s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508911133s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669469833s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508911133s@ mbc={}] exit Start 0.000008 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669469833s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508911133s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.959140 15 0.000112
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.970904 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041090965s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880569458s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.971056 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.971078 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330365 1 0.000029
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041068077s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880569458s@ mbc={}] exit Reset 0.000048 1 0.000123
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.339081 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041068077s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880569458s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040925980s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880439758s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041068077s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880569458s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041068077s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880569458s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041068077s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880569458s@ mbc={}] exit Start 0.000017 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.899977 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041068077s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880569458s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.900019 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040884972s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880439758s@ mbc={}] exit Reset 0.000057 1 0.000079
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040884972s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880439758s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.e] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330234 1 0.000027
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669351578s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508926392s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040884972s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880439758s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040884972s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880439758s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.e] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040884972s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880439758s@ mbc={}] exit Start 0.000028 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040884972s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880439758s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669313431s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508926392s@ mbc={}] exit Reset 0.000061 1 0.000111
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669313431s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508926392s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669313431s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508926392s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.959342 15 0.000107
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.970249 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.971157 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.971188 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.965321 15 0.000228
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.971493 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.971661 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040772438s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880477905s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.971721 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034724236s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.874450684s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034696579s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.874450684s@ mbc={}] exit Reset 0.000045 1 0.000075
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034696579s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.874450684s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034696579s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.874450684s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034696579s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.874450684s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034696579s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.874450684s@ mbc={}] exit Start 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034696579s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.874450684s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.339090 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.897997 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.898079 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330639 1 0.000025
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.339277 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.902705 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330664 1 0.000028
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.902723 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.339328 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.903190 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040751457s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880477905s@ mbc={}] exit Reset 0.000193 1 0.000133
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.903212 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1f] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1a] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040751457s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880477905s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508941650s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669108391s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508995056s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040751457s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880477905s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040751457s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880477905s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1f] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.18] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040751457s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880477905s@ mbc={}] exit Start 0.000013 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669086456s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508987427s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1a] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040751457s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880477905s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.18] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508987427s@ mbc={}] exit Reset 0.000055 1 0.000084
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508987427s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669073105s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508995056s@ mbc={}] exit Reset 0.000077 1 0.000074
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508987427s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669003487s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508941650s@ mbc={}] exit Reset 0.000098 1 0.000378
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508987427s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669073105s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508995056s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669003487s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508941650s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508987427s@ mbc={}] exit Start 0.000012 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669073105s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508995056s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669003487s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508941650s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508987427s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669073105s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508995056s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669003487s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508941650s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669073105s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508995056s@ mbc={}] exit Start 0.000008 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669073105s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508995056s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669003487s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508941650s@ mbc={}] exit Start 0.000016 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.959197 15 0.000122
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669003487s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508941650s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.969834 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.971494 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.971529 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040740967s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880767822s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330792 1 0.000020
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.339419 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.898241 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.898258 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.959204 15 0.000141
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1b] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.970161 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668955803s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.509048462s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.971619 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040719986s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880767822s@ mbc={}] exit Reset 0.000039 1 0.000060
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1b] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.971664 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040719986s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880767822s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040719986s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880767822s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668935776s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509048462s@ mbc={}] exit Reset 0.000043 1 0.000072
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040719986s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880767822s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668935776s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509048462s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040719986s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880767822s@ mbc={}] exit Start 0.000010 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668935776s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509048462s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040763855s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880889893s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040719986s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880767822s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.015009 2 0.000057
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668935776s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509048462s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668935776s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509048462s@ mbc={}] exit Start 0.000044 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668935776s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509048462s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040732384s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880889893s@ mbc={}] exit Reset 0.000049 1 0.000074
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040732384s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880889893s@ mbc={}] enter Started
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040732384s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880889893s@ mbc={}] enter Start
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040732384s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880889893s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040732384s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880889893s@ mbc={}] exit Start 0.000012 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014392 2 0.000028
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040732384s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880889893s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.012731 2 0.000073
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014022 2 0.000033
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.012398 2 0.000052
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014881 2 0.000053
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000007 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014371 2 0.000025
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.011153 2 0.000024
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.012262 2 0.000021
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.010870 2 0.000018
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.012012 2 0.000029
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.009982 2 0.000018
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.009896 2 0.000018
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000017 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008784 2 0.000038
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008223 2 0.000057
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007847 2 0.000025
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.010783 2 0.000020
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007707 2 0.000025
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.010714 2 0.000019
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000098 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000010 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014230 2 0.000022
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014021 2 0.000022
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014260 2 0.000029
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013855 2 0.000019
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013658 2 0.000020
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000283 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013183 2 0.000017
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013962 2 0.000019
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013746 2 0.000020
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013384 2 0.000019
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013155 2 0.000020
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.012304 2 0.000030
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.012126 2 0.000032
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.011098 2 0.000035
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000018 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.009258 2 0.000040
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.016495 2 0.000091
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000009 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.13] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.13] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013586 2 0.000061
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.012867 2 0.000043
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1c] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1c] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.11] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.15] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.15] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.11] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.a] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.8] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.a] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.5] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.8] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.5] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.2] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.2] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.c] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.c] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1a] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1a] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.017104 2 0.000029
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.9] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.9] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.f] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.6] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.f] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.6] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.4] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.3] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.017538 2 0.000026
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669313431s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508926392s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.3] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669313431s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508926392s@ mbc={}] exit Start 0.013001 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669313431s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508926392s@ mbc={}] enter Started/Stray
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.4] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.e] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.e] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.18] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1f] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.18] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1b] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1f] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.024493 2 0.000019
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1b] failed. State was: unregistering
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.b scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.b scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 57532416 unmapped: 1130496 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 46 handle_osd_map epochs [46,47], i have 46, src has [1,47]
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 46 handle_osd_map epochs [46,47], i have 47, src has [1,47]
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.070264 2 0.000059
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.087918 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.074607 2 0.000059
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.074779 2 0.000064
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.088533 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.079415 2 0.000033
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.088787 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.088019 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.082553 2 0.000192
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.090501 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.082692 2 0.000130
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.090722 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.079742 2 0.000041
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.090942 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.082972 2 0.000043
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.091391 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.083023 2 0.000032
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.091966 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.071673 2 0.000045
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.088892 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.080154 2 0.000088
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.092600 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.080032 2 0.000226
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.092473 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.083345 2 0.000047
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.093309 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.083405 2 0.000056
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.093477 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.080412 2 0.000117
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.093653 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.080525 2 0.000327
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.080687 2 0.000021
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.093921 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.093981 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.083495 2 0.000119
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.094320 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.083625 2 0.000053
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.094474 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.080803 2 0.000378
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.094653 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.083918 2 0.000038
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.094851 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.080940 2 0.000393
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.094970 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.081468 2 0.000023
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.095188 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.084151 2 0.000036
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.095386 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.081640 2 0.000024
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.095587 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.071260 2 0.000044
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.081870 2 0.000036
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.096232 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.095865 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.084357 2 0.000034
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.096436 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.082350 2 0.000045
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.096669 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.084573 2 0.000036
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.096937 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.084719 2 0.000054
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.097331 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.084784 2 0.000071
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.097686 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.081413 2 0.000029
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.098126 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.086222 2 0.000668
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.098636 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.081406 2 0.000529
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.096467 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.084893 2 0.000030
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.099135 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.084913 2 0.000036
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.099411 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 handle_osd_map epochs [47,47], i have 47, src has [1,47]
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.086486 2 0.000032
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.099929 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.085118 2 0.000108
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.085236 2 0.000033
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.100194 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.099733 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.085368 2 0.000042
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.100497 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.086695 2 0.000052
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.089745 2 0.000037
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.100720 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.100957 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.086880 2 0.000051
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.101305 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.086923 2 0.000174
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.101788 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.087048 2 0.000100
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.102453 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 handle_osd_map epochs [47,47], i have 47, src has [1,47]
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008505 4 0.000175
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008404 4 0.000117
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.009155 4 0.000128
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008966 4 0.000063
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000028 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008922 4 0.000055
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008989 4 0.000083
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008995 4 0.000525
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008882 4 0.000066
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008878 4 0.000106
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017619 4 0.000097
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017680 4 0.000120
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017891 4 0.000069
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017855 4 0.000070
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017893 4 0.000046
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018147 4 0.000277
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018366 4 0.000054
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018436 4 0.000196
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018391 4 0.000092
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018429 4 0.000051
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000084 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018413 4 0.000047
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018363 4 0.000036
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018378 4 0.000056
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018303 4 0.000058
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018263 4 0.000067
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018230 4 0.000092
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018093 4 0.000056
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018037 4 0.000049
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000012 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018149 4 0.000148
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017985 4 0.000185
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017939 4 0.000058
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017909 4 0.000063
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017906 4 0.000056
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017840 4 0.000501
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017847 4 0.000066
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000014 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017869 4 0.000511
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017800 4 0.000164
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017795 4 0.000080
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017707 4 0.000059
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017761 4 0.000069
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017697 4 0.000059
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000011 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017723 4 0.000083
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017671 4 0.000054
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017598 4 0.000052
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017516 4 0.000068
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000015 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017918 4 0.000281
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017917 4 0.000341
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.113420 7 0.000043
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.113356 7 0.000060
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.113800 7 0.000062
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000088 1 0.000052
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000178 1 0.000051
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.15] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.114318 7 0.000080
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000468 1 0.000021
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.11] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000327 1 0.000071
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.118674 7 0.000045
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.114637 7 0.000103
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.114641 7 0.000079
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000125 1 0.000047
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000166 1 0.000016
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1b] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.005130 1 0.005049
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.124248 7 0.000078
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.124554 7 0.000111
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000077 1 0.000059
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000107 1 0.000040
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1c] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.124956 7 0.000087
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.125468 7 0.000066
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.124210 7 0.000120
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.123791 7 0.000065
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.123704 7 0.000083
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.123920 7 0.000272
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.124155 7 0.000083
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.123598 7 0.000077
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.123556 7 0.000074
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.123251 7 0.000069
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.110379 7 0.013105
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.125429 7 0.000073
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.123108 7 0.000081
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000361 1 0.000202
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.122937 7 0.000130
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.8] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000376 1 0.000125
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.5] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000662 1 0.000231
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000768 1 0.000014
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.2] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001154 1 0.000028
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001254 1 0.000094
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001368 1 0.000059
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001476 1 0.000026
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.e] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001548 1 0.000031
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.002443 1 0.000119
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.002537 1 0.000292
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.a] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.002627 1 0.000081
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1a] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.002753 1 0.000146
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.c] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.002723 1 0.000120
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.130748 7 0.000059
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.130511 7 0.000069
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.130263 7 0.000061
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.127536 7 0.000069
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.127486 7 0.000084
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.127987 7 0.000075
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.128565 7 0.000067
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.129102 7 0.000055
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.127676 7 0.000146
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.129646 7 0.000068
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.128672 7 0.000135
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.128033 7 0.000436
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.127411 7 0.000252
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.130213 7 0.000086
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.127365 7 0.000105
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.129504 7 0.000066
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.127418 7 0.000072
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000422 1 0.000037
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.13] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000522 1 0.000059
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.129684 7 0.000063
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000788 1 0.000038
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000828 1 0.000016
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000938 1 0.000013
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001012 1 0.000014
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000961 1 0.000109
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001007 1 0.000020
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001042 1 0.000023
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001094 1 0.000186
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.6] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001141 1 0.000024
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.4] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001165 1 0.000019
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001203 1 0.000018
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001309 1 0.000025
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1f] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001376 1 0.000024
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.f] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001293 1 0.000478
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.3] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001525 1 0.000240
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.18] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001384 1 0.000482
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.9] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.11( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.021316 1 0.000060
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.11( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.021454 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.11( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.134917 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.15( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.021510 1 0.000134
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.15( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.021721 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.15( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.135117 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.11( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.027524 1 0.000078
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.11( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.028039 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.11( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.141868 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.15] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.11] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.16( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.034883 1 0.000090
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.16( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.035246 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.16( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.149597 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.17( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.038378 1 0.000046
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.17( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.038561 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.17( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.157272 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1b( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.045626 1 0.000022
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1b( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.045866 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1b( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.160570 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1b] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.047894 1 0.000119
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.053103 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.167786 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.18( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.055158 1 0.000058
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.18( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.055282 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.18( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.179571 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1c( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.062318 1 0.001171
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1c( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.062469 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1c( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.187077 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1c] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.8( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.067269 1 0.000211
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.8( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.067682 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.8( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.192698 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.8] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.e( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.074391 1 0.000042
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.e( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.075107 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.e( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.200637 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.5( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.081736 1 0.000052
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.5( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.082157 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.5( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.206530 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.5] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.2( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.088612 1 0.000054
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.2( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.089437 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.2( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.213283 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.2] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.8( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.095651 1 0.000061
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.8( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.096879 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.8( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.220540 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.5( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.103684 1 0.000055
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.5( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.105003 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.5( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.229260 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.7( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.110314 1 0.000127
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.7( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.111749 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.7( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.235527 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.e( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.117562 1 0.000044
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.e( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.119111 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.e( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.242569 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.e] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1d( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.124828 1 0.000141
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1d( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.126513 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1d( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.249820 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.131583 1 0.000086
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.134087 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.258131 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.a( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.138570 1 0.000056
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.a( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.141168 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.a( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.266869 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.a] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1a( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.145940 1 0.000083
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1a( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.148599 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1a( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.271749 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1a] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.c( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.153311 1 0.000103
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.c( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.156105 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.c( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.279691 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.c] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1e( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.160571 1 0.000052
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1e( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.163406 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1e( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.286415 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.13( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.166513 1 0.000135
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.13( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.166969 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.13( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.297749 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.13] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.15( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.173744 1 0.000147
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.15( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.174316 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.15( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.304892 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.12( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.180823 1 0.000053
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.12( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.181660 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.12( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.311995 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.9( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.188294 1 0.000097
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.9( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.189204 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.9( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.316778 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.a( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.195677 1 0.000045
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.a( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.196672 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.a( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.324206 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.6( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.202718 1 0.000033
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.6( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.203764 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.6( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.331781 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.3( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.210207 1 0.000054
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.3( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.211228 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.3( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.339924 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.c( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.217437 1 0.000051
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.c( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.218482 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.c( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.348164 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.224968 1 0.000039
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.226045 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.354762 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.6( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.232345 1 0.000033
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.6( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.233491 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.6( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.362776 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.6] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.4( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.239801 1 0.000034
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.4( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.240988 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.4( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.369333 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.4] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1b( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.246939 1 0.000030
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1b( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.248139 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1b( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.375591 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.254142 1 0.000059
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.255384 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.385620 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1f( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.261572 1 0.000106
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1f( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.262919 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1f( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.390330 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1f] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.f( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.269302 1 0.000037
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.f( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.270734 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.f( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.400265 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.f] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.3( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.276355 1 0.000092
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.3( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.277710 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.3( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.405904 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.3] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.18( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.283604 1 0.000051
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.18( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.285182 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.18( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.412663 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.18] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.9( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.291106 1 0.000039
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.9( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.292587 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.9( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.422772 0 0.000000
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.9] failed. State was: not registered w/ OSD
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 57868288 unmapped: 1843200 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e6000/0x0/0x4ffc00000, data 0x9dc1e/0xe6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e6000/0x0/0x4ffc00000, data 0x9dc1e/0xe6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 57901056 unmapped: 1810432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 352071 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 57917440 unmapped: 1794048 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e6000/0x0/0x4ffc00000, data 0x9dc1e/0xe6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 57974784 unmapped: 1736704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58032128 unmapped: 1679360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58032128 unmapped: 1679360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e5000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.d deep-scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.d deep-scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58032128 unmapped: 1679360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.10 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.10 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 361942 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 1671168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 1671168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58015744 unmapped: 1695744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58015744 unmapped: 1695744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.520864487s of 12.925769806s, submitted: 415
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58015744 unmapped: 1695744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 363090 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58015744 unmapped: 1695744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58015744 unmapped: 1695744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 1671168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 1662976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58056704 unmapped: 1654784 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 364238 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58064896 unmapped: 1646592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58064896 unmapped: 1646592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58032128 unmapped: 1679360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 1671168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.19 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.918129921s of 10.043713570s, submitted: 6
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 1671168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 365386 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 1662976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 1662976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58073088 unmapped: 1638400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58081280 unmapped: 1630208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58073088 unmapped: 1638400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1c scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1c scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 367682 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58073088 unmapped: 1638400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58081280 unmapped: 1630208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.7 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.7 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58097664 unmapped: 1613824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58105856 unmapped: 1605632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58105856 unmapped: 1605632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 368829 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58114048 unmapped: 1597440 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.b scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.829093933s of 11.850773811s, submitted: 6
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.b scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58163200 unmapped: 1548288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 1540096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 1540096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 1540096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 369976 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58179584 unmapped: 1531904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.d scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.d scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58179584 unmapped: 1531904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58179584 unmapped: 1531904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58195968 unmapped: 1515520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.10 deep-scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.10 deep-scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58204160 unmapped: 1507328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 372271 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58212352 unmapped: 1499136 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58212352 unmapped: 1499136 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58220544 unmapped: 1490944 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 1482752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 1482752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.12 deep-scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.794986725s of 13.843894005s, submitted: 6
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.12 deep-scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 373419 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 1482752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58245120 unmapped: 1466368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58253312 unmapped: 1458176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58253312 unmapped: 1458176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58253312 unmapped: 1458176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 375715 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58269696 unmapped: 1441792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58269696 unmapped: 1441792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 1417216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 1417216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 1417216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 375715 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58302464 unmapped: 1409024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.17 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.927809715s of 10.960511208s, submitted: 6
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.17 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 1400832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.19 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.19 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58318848 unmapped: 1392640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58318848 unmapped: 1392640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.1d scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.1d scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58335232 unmapped: 1376256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.1e scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.1e scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 380307 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58351616 unmapped: 1359872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58351616 unmapped: 1359872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58359808 unmapped: 1351680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1e scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1e scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58359808 unmapped: 1351680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58368000 unmapped: 1343488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 381455 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58368000 unmapped: 1343488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.706548691s of 10.881001472s, submitted: 10
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58376192 unmapped: 1335296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.19 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58376192 unmapped: 1335296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58384384 unmapped: 1327104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 1318912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 383751 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 1318912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 1318912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58400768 unmapped: 1310720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58417152 unmapped: 1294336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 1286144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 386047 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 1286144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 1286144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 1269760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 1269760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 1261568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 386047 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 1261568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.f deep-scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.945797920s of 14.976054192s, submitted: 8
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.f deep-scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 1261568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58458112 unmapped: 1253376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 1245184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 1236992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 388341 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.d scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.d scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 1236992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.c scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.c scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58482688 unmapped: 1228800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58490880 unmapped: 1220608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.6 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58490880 unmapped: 1220608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58499072 unmapped: 1212416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 391782 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.c deep-scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.c deep-scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 1204224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.d scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.d scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 1204224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1 deep-scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.828042984s of 10.924718857s, submitted: 14
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1 deep-scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 1187840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.f scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.f scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 1179648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.5 deep-scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.5 deep-scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 1171456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.7 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.7 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 398664 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58548224 unmapped: 1163264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 1155072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58572800 unmapped: 1138688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58572800 unmapped: 1138688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 1130496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 398664 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 1130496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58589184 unmapped: 1122304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58597376 unmapped: 1114112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.3 deep-scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.857633591s of 11.003772736s, submitted: 8
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.3 deep-scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 1105920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 1105920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 399811 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 1105920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58613760 unmapped: 1097728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58621952 unmapped: 1089536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58621952 unmapped: 1089536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 1081344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 400958 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 1081344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 1073152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.9 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.9 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58646528 unmapped: 1064960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58654720 unmapped: 1056768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58654720 unmapped: 1056768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 402105 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.16 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.033282280s of 12.157509804s, submitted: 6
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.16 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58687488 unmapped: 1024000 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 404401 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.12 deep-scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.12 deep-scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 405549 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58736640 unmapped: 974848 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58736640 unmapped: 974848 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.857577324s of 11.936425209s, submitted: 6
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58761216 unmapped: 950272 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58761216 unmapped: 950272 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58785792 unmapped: 925696 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 407845 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58810368 unmapped: 901120 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.a scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.a scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 408992 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.d scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.989212036s of 10.056298256s, submitted: 6
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.d scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58834944 unmapped: 876544 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.6 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.6 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411286 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 414727 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.b scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.990326881s of 10.023887634s, submitted: 10
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.b scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58875904 unmapped: 835584 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.5 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.5 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58892288 unmapped: 819200 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 417021 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.e scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.e scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.8 deep-scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.8 deep-scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 420462 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58966016 unmapped: 745472 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.17 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.17 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.916434288s of 10.038483620s, submitted: 12
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 423906 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58998784 unmapped: 712704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59056128 unmapped: 655360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59056128 unmapped: 655360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 427349 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.1b deep-scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.910740852s of 10.036123276s, submitted: 10
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.1b deep-scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 428497 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59121664 unmapped: 589824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59121664 unmapped: 589824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1d scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1d scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59121664 unmapped: 589824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1c scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1c scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 430793 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59162624 unmapped: 548864 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59170816 unmapped: 540672 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59219968 unmapped: 491520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59219968 unmapped: 491520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59228160 unmapped: 483328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59228160 unmapped: 483328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59236352 unmapped: 475136 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59244544 unmapped: 466944 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59244544 unmapped: 466944 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59252736 unmapped: 458752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59252736 unmapped: 458752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59260928 unmapped: 450560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59260928 unmapped: 450560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59260928 unmapped: 450560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59269120 unmapped: 442368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59269120 unmapped: 442368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59277312 unmapped: 434176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59269120 unmapped: 442368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59269120 unmapped: 442368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59277312 unmapped: 434176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59277312 unmapped: 434176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59285504 unmapped: 425984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59285504 unmapped: 425984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59285504 unmapped: 425984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59531264 unmapped: 180224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59531264 unmapped: 180224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59621376 unmapped: 90112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59678720 unmapped: 32768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59678720 unmapped: 32768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59678720 unmapped: 32768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59686912 unmapped: 24576 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59686912 unmapped: 24576 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 8192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 8192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 0 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 0 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59736064 unmapped: 1024000 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59736064 unmapped: 1024000 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59744256 unmapped: 1015808 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59744256 unmapped: 1015808 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59801600 unmapped: 958464 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59801600 unmapped: 958464 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59817984 unmapped: 942080 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59817984 unmapped: 942080 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59817984 unmapped: 942080 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59826176 unmapped: 933888 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59826176 unmapped: 933888 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59834368 unmapped: 925696 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59834368 unmapped: 925696 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59834368 unmapped: 925696 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59850752 unmapped: 909312 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59850752 unmapped: 909312 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59867136 unmapped: 892928 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59867136 unmapped: 892928 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59883520 unmapped: 876544 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59883520 unmapped: 876544 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59883520 unmapped: 876544 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59891712 unmapped: 868352 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59891712 unmapped: 868352 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59908096 unmapped: 851968 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59908096 unmapped: 851968 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59916288 unmapped: 843776 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59916288 unmapped: 843776 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59924480 unmapped: 835584 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59924480 unmapped: 835584 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59924480 unmapped: 835584 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59940864 unmapped: 819200 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59940864 unmapped: 819200 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59949056 unmapped: 811008 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59949056 unmapped: 811008 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59965440 unmapped: 794624 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59965440 unmapped: 794624 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59965440 unmapped: 794624 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59981824 unmapped: 778240 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59981824 unmapped: 778240 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59990016 unmapped: 770048 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59990016 unmapped: 770048 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59990016 unmapped: 770048 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60022784 unmapped: 737280 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60022784 unmapped: 737280 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60022784 unmapped: 737280 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60039168 unmapped: 720896 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60039168 unmapped: 720896 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60047360 unmapped: 712704 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60047360 unmapped: 712704 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60047360 unmapped: 712704 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60055552 unmapped: 704512 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60055552 unmapped: 704512 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60071936 unmapped: 688128 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60071936 unmapped: 688128 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60080128 unmapped: 679936 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60080128 unmapped: 679936 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60080128 unmapped: 679936 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60088320 unmapped: 671744 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60088320 unmapped: 671744 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 663552 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 663552 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 663552 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60121088 unmapped: 638976 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60121088 unmapped: 638976 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60129280 unmapped: 630784 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60129280 unmapped: 630784 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60129280 unmapped: 630784 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60137472 unmapped: 622592 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60137472 unmapped: 622592 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60153856 unmapped: 606208 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60153856 unmapped: 606208 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60178432 unmapped: 581632 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60178432 unmapped: 581632 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60186624 unmapped: 573440 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60186624 unmapped: 573440 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60194816 unmapped: 565248 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60194816 unmapped: 565248 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60194816 unmapped: 565248 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60211200 unmapped: 548864 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60211200 unmapped: 548864 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 podman[261541]: 2025-12-01 09:39:32.756955801 +0000 UTC m=+0.055629512 container create d47e3746fc9187e59d95a88b5eac2c80c66dce35c0de6ea7bb2f386d9da1aea3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_yalow, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60227584 unmapped: 532480 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60227584 unmapped: 532480 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60235776 unmapped: 524288 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60235776 unmapped: 524288 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60243968 unmapped: 516096 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60243968 unmapped: 516096 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60243968 unmapped: 516096 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60260352 unmapped: 499712 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60260352 unmapped: 499712 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60260352 unmapped: 499712 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60268544 unmapped: 491520 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60268544 unmapped: 491520 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60276736 unmapped: 483328 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60276736 unmapped: 483328 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60284928 unmapped: 475136 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60284928 unmapped: 475136 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60284928 unmapped: 475136 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60293120 unmapped: 466944 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60293120 unmapped: 466944 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60301312 unmapped: 458752 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60301312 unmapped: 458752 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60301312 unmapped: 458752 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60309504 unmapped: 450560 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60309504 unmapped: 450560 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60309504 unmapped: 450560 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60317696 unmapped: 442368 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60317696 unmapped: 442368 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.2 total, 600.0 interval
Cumulative writes: 4343 writes, 19K keys, 4343 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
Cumulative WAL: 4343 writes, 398 syncs, 10.91 writes per sync, written: 0.02 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 4343 writes, 19K keys, 4343 commit groups, 1.0 writes per commit group, ingest: 16.04 MB, 0.03 MB/s
Interval WAL: 4343 writes, 398 syncs, 10.91 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.2 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.2 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.2 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60391424 unmapped: 368640 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60399616 unmapped: 360448 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60399616 unmapped: 360448 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60416000 unmapped: 344064 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60416000 unmapped: 344064 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60424192 unmapped: 335872 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60424192 unmapped: 335872 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60424192 unmapped: 335872 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60432384 unmapped: 327680 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60440576 unmapped: 319488 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60440576 unmapped: 319488 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60448768 unmapped: 311296 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60448768 unmapped: 311296 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60456960 unmapped: 303104 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60456960 unmapped: 303104 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60473344 unmapped: 286720 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60473344 unmapped: 286720 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60489728 unmapped: 270336 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60489728 unmapped: 270336 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60514304 unmapped: 245760 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60514304 unmapped: 245760 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60530688 unmapped: 229376 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60530688 unmapped: 229376 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60547072 unmapped: 212992 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60547072 unmapped: 212992 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60563456 unmapped: 196608 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60563456 unmapped: 196608 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60571648 unmapped: 188416 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60571648 unmapped: 188416 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60596224 unmapped: 163840 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60596224 unmapped: 163840 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60604416 unmapped: 155648 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60604416 unmapped: 155648 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60612608 unmapped: 147456 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60612608 unmapped: 147456 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60612608 unmapped: 147456 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60628992 unmapped: 131072 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60628992 unmapped: 131072 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60645376 unmapped: 114688 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60645376 unmapped: 114688 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60653568 unmapped: 106496 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60653568 unmapped: 106496 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60653568 unmapped: 106496 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60710912 unmapped: 49152 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60710912 unmapped: 49152 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60710912 unmapped: 49152 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60768256 unmapped: 1040384 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60768256 unmapped: 1040384 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60768256 unmapped: 1040384 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60776448 unmapped: 1032192 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60776448 unmapped: 1032192 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60776448 unmapped: 1032192 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 ms_handle_reset con 0x555f1b271c00 session 0x555f19f23860
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 ms_handle_reset con 0x555f1b3fe800 session 0x555f1b28fc20
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 systemd[1]: Started libpod-conmon-d47e3746fc9187e59d95a88b5eac2c80c66dce35c0de6ea7bb2f386d9da1aea3.scope.
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 podman[261541]: 2025-12-01 09:39:32.732488067 +0000 UTC m=+0.031161808 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:39:32 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:46:50 np0005540741 podman[275204]: 2025-12-01 09:46:50.979701341 +0000 UTC m=+0.081540564 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  1 04:46:51 np0005540741 rsyslogd[1007]: imjournal: 15766 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec  1 04:46:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1090: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:46:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1091: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:46:55 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:46:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1092: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #51. Immutable memtables: 0.
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.063059) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 51
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582416063108, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 2053, "num_deletes": 251, "total_data_size": 2360806, "memory_usage": 2416272, "flush_reason": "Manual Compaction"}
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #52: started
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582416084674, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 52, "file_size": 2278801, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20558, "largest_seqno": 22610, "table_properties": {"data_size": 2269563, "index_size": 5796, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 18529, "raw_average_key_size": 19, "raw_value_size": 2251041, "raw_average_value_size": 2423, "num_data_blocks": 266, "num_entries": 929, "num_filter_entries": 929, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582189, "oldest_key_time": 1764582189, "file_creation_time": 1764582416, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 52, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 21699 microseconds, and 11371 cpu microseconds.
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.084751) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #52: 2278801 bytes OK
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.084787) [db/memtable_list.cc:519] [default] Level-0 commit table #52 started
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.086843) [db/memtable_list.cc:722] [default] Level-0 commit table #52: memtable #1 done
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.086867) EVENT_LOG_v1 {"time_micros": 1764582416086859, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.086891) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2352208, prev total WAL file size 2352208, number of live WAL files 2.
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000048.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.088609) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [52(2225KB)], [50(5587KB)]
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582416088698, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [52], "files_L6": [50], "score": -1, "input_data_size": 8000126, "oldest_snapshot_seqno": -1}
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #53: 4423 keys, 6764644 bytes, temperature: kUnknown
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582416147635, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 53, "file_size": 6764644, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 6732018, "index_size": 20484, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11077, "raw_key_size": 106061, "raw_average_key_size": 23, "raw_value_size": 6649479, "raw_average_value_size": 1503, "num_data_blocks": 873, "num_entries": 4423, "num_filter_entries": 4423, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764582416, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.147990) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 6764644 bytes
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.149860) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 135.5 rd, 114.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 5.5 +0.0 blob) out(6.5 +0.0 blob), read-write-amplify(6.5) write-amplify(3.0) OK, records in: 4937, records dropped: 514 output_compression: NoCompression
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.149890) EVENT_LOG_v1 {"time_micros": 1764582416149876, "job": 26, "event": "compaction_finished", "compaction_time_micros": 59036, "compaction_time_cpu_micros": 34716, "output_level": 6, "num_output_files": 1, "total_output_size": 6764644, "num_input_records": 4937, "num_output_records": 4423, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000052.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582416150966, "job": 26, "event": "table_file_deletion", "file_number": 52}
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582416153379, "job": 26, "event": "table_file_deletion", "file_number": 50}
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.088340) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.153519) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.153529) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.153533) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.153536) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:46:56 np0005540741 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.153540) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:46:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1093: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:46:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1094: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:00 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:47:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1095: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1096: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:05 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:47:05 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1097: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:06 np0005540741 podman[275223]: 2025-12-01 09:47:06.991248429 +0000 UTC m=+0.084018516 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:47:07 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1098: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:09 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1099: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:10 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:47:11 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1100: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:47:13
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'images', 'volumes', 'cephfs.cephfs.meta', 'vms', 'cephfs.cephfs.data', '.mgr']
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec  1 04:47:13 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1101: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:15 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:47:15 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1102: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:17 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1103: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:18 np0005540741 podman[275243]: 2025-12-01 09:47:18.01250418 +0000 UTC m=+0.112590166 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Dec  1 04:47:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec  1 04:47:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:47:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec  1 04:47:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:47:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec  1 04:47:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:47:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec  1 04:47:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:47:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:47:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:47:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006661126644201341 of space, bias 1.0, pg target 0.19983379932604023 quantized to 32 (current 32)
Dec  1 04:47:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:47:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec  1 04:47:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec  1 04:47:18 np0005540741 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec  1 04:47:19 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1104: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:47:20.485 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  1 04:47:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:47:20.485 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  1 04:47:20 np0005540741 ovn_metadata_agent[159893]: 2025-12-01 09:47:20.485 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  1 04:47:20 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:47:21 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1105: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:21 np0005540741 podman[275271]: 2025-12-01 09:47:21.957589972 +0000 UTC m=+0.059808620 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, tcib_managed=true)
Dec  1 04:47:23 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1106: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:25 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:47:25 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1107: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:27 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1108: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:29 np0005540741 podman[275462]: 2025-12-01 09:47:29.911500786 +0000 UTC m=+0.068412787 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Dec  1 04:47:29 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1109: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:30 np0005540741 podman[275462]: 2025-12-01 09:47:30.028766776 +0000 UTC m=+0.185678767 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec  1 04:47:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:47:30 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:47:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:47:30 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:47:30 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:47:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:47:31 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:47:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec  1 04:47:31 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:47:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec  1 04:47:31 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:47:31 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 442cf54a-cc02-4e02-a683-f372c98f4f1f does not exist
Dec  1 04:47:31 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 827f365c-a929-46fa-825d-62f09dd2d050 does not exist
Dec  1 04:47:31 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev 227a8cc2-5b7b-468f-9b7f-732a29910564 does not exist
Dec  1 04:47:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec  1 04:47:31 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec  1 04:47:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec  1 04:47:31 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:47:31 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:47:31 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:47:31 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:47:31 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:47:31 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:47:31 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:47:31 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:47:31 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1110: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:32 np0005540741 podman[275874]: 2025-12-01 09:47:32.271722083 +0000 UTC m=+0.049964327 container create 8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_joliot, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec  1 04:47:32 np0005540741 systemd[1]: Started libpod-conmon-8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd.scope.
Dec  1 04:47:32 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:47:32 np0005540741 podman[275874]: 2025-12-01 09:47:32.248874026 +0000 UTC m=+0.027116300 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:47:32 np0005540741 podman[275874]: 2025-12-01 09:47:32.351434373 +0000 UTC m=+0.129676637 container init 8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_joliot, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:47:32 np0005540741 podman[275874]: 2025-12-01 09:47:32.3593061 +0000 UTC m=+0.137548334 container start 8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_joliot, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:47:32 np0005540741 podman[275874]: 2025-12-01 09:47:32.363029517 +0000 UTC m=+0.141271781 container attach 8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_joliot, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:47:32 np0005540741 optimistic_joliot[275890]: 167 167
Dec  1 04:47:32 np0005540741 systemd[1]: libpod-8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd.scope: Deactivated successfully.
Dec  1 04:47:32 np0005540741 conmon[275890]: conmon 8d145a496174f845254a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd.scope/container/memory.events
Dec  1 04:47:32 np0005540741 podman[275874]: 2025-12-01 09:47:32.366666701 +0000 UTC m=+0.144908945 container died 8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_joliot, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Dec  1 04:47:32 np0005540741 systemd[1]: var-lib-containers-storage-overlay-19c18688748a935448676d260a50cbfc3d66bcefbe7df20c926aea50f1fe4a60-merged.mount: Deactivated successfully.
Dec  1 04:47:32 np0005540741 podman[275874]: 2025-12-01 09:47:32.404658333 +0000 UTC m=+0.182900577 container remove 8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_joliot, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:47:32 np0005540741 systemd[1]: libpod-conmon-8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd.scope: Deactivated successfully.
Dec  1 04:47:32 np0005540741 podman[275915]: 2025-12-01 09:47:32.603670872 +0000 UTC m=+0.052706176 container create 20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_bose, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:47:32 np0005540741 systemd[1]: Started libpod-conmon-20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285.scope.
Dec  1 04:47:32 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:47:32 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e9e4f7ae612cc66ca0c7dbadd4d969a66bd3f07b93bb7968a61560e070b0124/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:47:32 np0005540741 podman[275915]: 2025-12-01 09:47:32.585787198 +0000 UTC m=+0.034822522 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:47:32 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e9e4f7ae612cc66ca0c7dbadd4d969a66bd3f07b93bb7968a61560e070b0124/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:47:32 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e9e4f7ae612cc66ca0c7dbadd4d969a66bd3f07b93bb7968a61560e070b0124/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:47:32 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e9e4f7ae612cc66ca0c7dbadd4d969a66bd3f07b93bb7968a61560e070b0124/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:47:32 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e9e4f7ae612cc66ca0c7dbadd4d969a66bd3f07b93bb7968a61560e070b0124/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:47:32 np0005540741 podman[275915]: 2025-12-01 09:47:32.694401779 +0000 UTC m=+0.143437103 container init 20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_bose, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:47:32 np0005540741 podman[275915]: 2025-12-01 09:47:32.701522204 +0000 UTC m=+0.150557508 container start 20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_bose, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec  1 04:47:32 np0005540741 podman[275915]: 2025-12-01 09:47:32.804574705 +0000 UTC m=+0.253610009 container attach 20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_bose, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:47:33 np0005540741 nova_compute[250706]: 2025-12-01 09:47:33.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  1 04:47:33 np0005540741 nova_compute[250706]: 2025-12-01 09:47:33.054 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec  1 04:47:33 np0005540741 youthful_bose[275931]: --> passed data devices: 0 physical, 3 LVM
Dec  1 04:47:33 np0005540741 youthful_bose[275931]: --> relative data size: 1.0
Dec  1 04:47:33 np0005540741 youthful_bose[275931]: --> All data devices are unavailable
Dec  1 04:47:33 np0005540741 systemd[1]: libpod-20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285.scope: Deactivated successfully.
Dec  1 04:47:33 np0005540741 podman[275915]: 2025-12-01 09:47:33.908108489 +0000 UTC m=+1.357143803 container died 20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_bose, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef)
Dec  1 04:47:33 np0005540741 systemd[1]: libpod-20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285.scope: Consumed 1.157s CPU time.
Dec  1 04:47:33 np0005540741 systemd[1]: var-lib-containers-storage-overlay-0e9e4f7ae612cc66ca0c7dbadd4d969a66bd3f07b93bb7968a61560e070b0124-merged.mount: Deactivated successfully.
Dec  1 04:47:33 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1111: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:33 np0005540741 podman[275915]: 2025-12-01 09:47:33.985564865 +0000 UTC m=+1.434600169 container remove 20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_bose, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:47:33 np0005540741 systemd[1]: libpod-conmon-20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285.scope: Deactivated successfully.
Dec  1 04:47:34 np0005540741 podman[276115]: 2025-12-01 09:47:34.793848272 +0000 UTC m=+0.071127645 container create 2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_mcclintock, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True)
Dec  1 04:47:34 np0005540741 systemd[1]: Started libpod-conmon-2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a.scope.
Dec  1 04:47:34 np0005540741 podman[276115]: 2025-12-01 09:47:34.762846871 +0000 UTC m=+0.040126334 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:47:34 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:47:34 np0005540741 podman[276115]: 2025-12-01 09:47:34.879951586 +0000 UTC m=+0.157230969 container init 2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_mcclintock, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:47:34 np0005540741 podman[276115]: 2025-12-01 09:47:34.892874808 +0000 UTC m=+0.170154221 container start 2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_mcclintock, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True)
Dec  1 04:47:34 np0005540741 podman[276115]: 2025-12-01 09:47:34.897400238 +0000 UTC m=+0.174679661 container attach 2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_mcclintock, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 04:47:34 np0005540741 clever_mcclintock[276132]: 167 167
Dec  1 04:47:34 np0005540741 systemd[1]: libpod-2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a.scope: Deactivated successfully.
Dec  1 04:47:34 np0005540741 conmon[276132]: conmon 2c4125b420df81671027 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a.scope/container/memory.events
Dec  1 04:47:34 np0005540741 podman[276115]: 2025-12-01 09:47:34.901359882 +0000 UTC m=+0.178639285 container died 2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_mcclintock, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec  1 04:47:34 np0005540741 systemd[1]: var-lib-containers-storage-overlay-9890bc63344c8753e91ea8205a958d432304f4c03b1a5d17e93264619b570da0-merged.mount: Deactivated successfully.
Dec  1 04:47:34 np0005540741 podman[276115]: 2025-12-01 09:47:34.951807581 +0000 UTC m=+0.229086974 container remove 2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_mcclintock, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec  1 04:47:34 np0005540741 systemd[1]: libpod-conmon-2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a.scope: Deactivated successfully.
Dec  1 04:47:35 np0005540741 podman[276158]: 2025-12-01 09:47:35.1556617 +0000 UTC m=+0.056107454 container create a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_lehmann, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:47:35 np0005540741 systemd[1]: Started libpod-conmon-a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7.scope.
Dec  1 04:47:35 np0005540741 podman[276158]: 2025-12-01 09:47:35.131130735 +0000 UTC m=+0.031576569 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:47:35 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:47:35 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da46aa3578d6033f89ebc9e27fb2db5a2afa153cef22b88e2f43cc4e5fcfc4db/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:47:35 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da46aa3578d6033f89ebc9e27fb2db5a2afa153cef22b88e2f43cc4e5fcfc4db/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:47:35 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da46aa3578d6033f89ebc9e27fb2db5a2afa153cef22b88e2f43cc4e5fcfc4db/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:47:35 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da46aa3578d6033f89ebc9e27fb2db5a2afa153cef22b88e2f43cc4e5fcfc4db/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:47:35 np0005540741 podman[276158]: 2025-12-01 09:47:35.269025268 +0000 UTC m=+0.169471092 container init a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_lehmann, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec  1 04:47:35 np0005540741 podman[276158]: 2025-12-01 09:47:35.286416778 +0000 UTC m=+0.186862532 container start a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_lehmann, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 04:47:35 np0005540741 podman[276158]: 2025-12-01 09:47:35.290524996 +0000 UTC m=+0.190970870 container attach a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_lehmann, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:47:35 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:47:35 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1112: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]: {
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:    "0": [
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:        {
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "devices": [
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "/dev/loop3"
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            ],
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "lv_name": "ceph_lv0",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "lv_size": "21470642176",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "name": "ceph_lv0",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "tags": {
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.cluster_name": "ceph",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.crush_device_class": "",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.encrypted": "0",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.osd_id": "0",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.type": "block",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.vdo": "0"
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            },
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "type": "block",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "vg_name": "ceph_vg0"
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:        }
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:    ],
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:    "1": [
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:        {
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "devices": [
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "/dev/loop4"
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            ],
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "lv_name": "ceph_lv1",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "lv_size": "21470642176",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "name": "ceph_lv1",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "path": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "tags": {
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.cluster_name": "ceph",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.crush_device_class": "",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.encrypted": "0",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.osd_id": "1",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.type": "block",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.vdo": "0"
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            },
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "type": "block",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "vg_name": "ceph_vg1"
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:        }
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:    ],
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:    "2": [
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:        {
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "devices": [
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "/dev/loop5"
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            ],
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "lv_name": "ceph_lv2",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "lv_size": "21470642176",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "name": "ceph_lv2",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "path": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "tags": {
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.cluster_name": "ceph",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.crush_device_class": "",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.encrypted": "0",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.osd_id": "2",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.type": "block",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:                "ceph.vdo": "0"
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            },
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "type": "block",
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:            "vg_name": "ceph_vg2"
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:        }
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]:    ]
Dec  1 04:47:36 np0005540741 peaceful_lehmann[276175]: }
Dec  1 04:47:36 np0005540741 systemd[1]: libpod-a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7.scope: Deactivated successfully.
Dec  1 04:47:36 np0005540741 podman[276158]: 2025-12-01 09:47:36.091888975 +0000 UTC m=+0.992334729 container died a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_lehmann, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec  1 04:47:36 np0005540741 systemd[1]: var-lib-containers-storage-overlay-da46aa3578d6033f89ebc9e27fb2db5a2afa153cef22b88e2f43cc4e5fcfc4db-merged.mount: Deactivated successfully.
Dec  1 04:47:36 np0005540741 podman[276158]: 2025-12-01 09:47:36.147247766 +0000 UTC m=+1.047693530 container remove a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_lehmann, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:47:36 np0005540741 systemd[1]: libpod-conmon-a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7.scope: Deactivated successfully.
Dec  1 04:47:36 np0005540741 podman[276336]: 2025-12-01 09:47:36.913752883 +0000 UTC m=+0.060447868 container create 8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_driscoll, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:47:36 np0005540741 systemd[1]: Started libpod-conmon-8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d.scope.
Dec  1 04:47:36 np0005540741 podman[276336]: 2025-12-01 09:47:36.886411528 +0000 UTC m=+0.033106603 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:47:36 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:47:37 np0005540741 podman[276336]: 2025-12-01 09:47:37.01212088 +0000 UTC m=+0.158815915 container init 8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_driscoll, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec  1 04:47:37 np0005540741 podman[276336]: 2025-12-01 09:47:37.022267572 +0000 UTC m=+0.168962557 container start 8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_driscoll, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507)
Dec  1 04:47:37 np0005540741 podman[276336]: 2025-12-01 09:47:37.027364918 +0000 UTC m=+0.174059993 container attach 8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_driscoll, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec  1 04:47:37 np0005540741 trusting_driscoll[276353]: 167 167
Dec  1 04:47:37 np0005540741 systemd[1]: libpod-8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d.scope: Deactivated successfully.
Dec  1 04:47:37 np0005540741 podman[276336]: 2025-12-01 09:47:37.030635852 +0000 UTC m=+0.177330847 container died 8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_driscoll, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:47:37 np0005540741 systemd[1]: var-lib-containers-storage-overlay-3dc38c1a22ac1b08a7dbb0a7d2d01b2dcf4d8c7211bc7e0635eb962152e43ace-merged.mount: Deactivated successfully.
Dec  1 04:47:37 np0005540741 podman[276336]: 2025-12-01 09:47:37.089209826 +0000 UTC m=+0.235904821 container remove 8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_driscoll, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:47:37 np0005540741 systemd[1]: libpod-conmon-8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d.scope: Deactivated successfully.
Dec  1 04:47:37 np0005540741 podman[276358]: 2025-12-01 09:47:37.149281432 +0000 UTC m=+0.077396485 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  1 04:47:37 np0005540741 podman[276395]: 2025-12-01 09:47:37.309215608 +0000 UTC m=+0.058804011 container create 3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_jackson, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Dec  1 04:47:37 np0005540741 systemd[1]: Started libpod-conmon-3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716.scope.
Dec  1 04:47:37 np0005540741 podman[276395]: 2025-12-01 09:47:37.282743507 +0000 UTC m=+0.032331950 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec  1 04:47:37 np0005540741 systemd[1]: Started libcrun container.
Dec  1 04:47:37 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd75a5b791990d0f684e7959f5318135e59f09956fc88fb02c308bf1fdb26fb3/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:47:37 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd75a5b791990d0f684e7959f5318135e59f09956fc88fb02c308bf1fdb26fb3/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:47:37 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd75a5b791990d0f684e7959f5318135e59f09956fc88fb02c308bf1fdb26fb3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:47:37 np0005540741 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd75a5b791990d0f684e7959f5318135e59f09956fc88fb02c308bf1fdb26fb3/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:47:37 np0005540741 podman[276395]: 2025-12-01 09:47:37.420428824 +0000 UTC m=+0.170017317 container init 3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_jackson, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec  1 04:47:37 np0005540741 podman[276395]: 2025-12-01 09:47:37.432746368 +0000 UTC m=+0.182334801 container start 3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_jackson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:47:37 np0005540741 podman[276395]: 2025-12-01 09:47:37.437163635 +0000 UTC m=+0.186752128 container attach 3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_jackson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec  1 04:47:37 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1113: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:38 np0005540741 strange_jackson[276412]: {
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:    "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:        "osd_id": 0,
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:        "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:        "type": "bluestore"
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:    },
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:    "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:        "osd_id": 1,
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:        "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:        "type": "bluestore"
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:    },
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:    "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:        "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:        "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:        "osd_id": 2,
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:        "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:        "type": "bluestore"
Dec  1 04:47:38 np0005540741 strange_jackson[276412]:    }
Dec  1 04:47:38 np0005540741 strange_jackson[276412]: }
Dec  1 04:47:38 np0005540741 systemd[1]: libpod-3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716.scope: Deactivated successfully.
Dec  1 04:47:38 np0005540741 systemd[1]: libpod-3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716.scope: Consumed 1.055s CPU time.
Dec  1 04:47:38 np0005540741 podman[276395]: 2025-12-01 09:47:38.480023593 +0000 UTC m=+1.229612006 container died 3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_jackson, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec  1 04:47:38 np0005540741 systemd[1]: var-lib-containers-storage-overlay-fd75a5b791990d0f684e7959f5318135e59f09956fc88fb02c308bf1fdb26fb3-merged.mount: Deactivated successfully.
Dec  1 04:47:38 np0005540741 podman[276395]: 2025-12-01 09:47:38.538127203 +0000 UTC m=+1.287715616 container remove 3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_jackson, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True)
Dec  1 04:47:38 np0005540741 systemd[1]: libpod-conmon-3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716.scope: Deactivated successfully.
Dec  1 04:47:38 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec  1 04:47:38 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:47:38 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec  1 04:47:38 np0005540741 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:47:38 np0005540741 ceph-mgr[75324]: [progress WARNING root] complete: ev cf5f88bc-022c-4c4c-a2c3-177e04ebd4d0 does not exist
Dec  1 04:47:39 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:47:39 np0005540741 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec  1 04:47:39 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1114: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:40 np0005540741 nova_compute[250706]: 2025-12-01 09:47:40.067 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:47:40 np0005540741 nova_compute[250706]: 2025-12-01 09:47:40.068 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:47:40 np0005540741 nova_compute[250706]: 2025-12-01 09:47:40.068 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:47:40 np0005540741 nova_compute[250706]: 2025-12-01 09:47:40.068 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:47:40 np0005540741 nova_compute[250706]: 2025-12-01 09:47:40.069 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:47:40 np0005540741 nova_compute[250706]: 2025-12-01 09:47:40.069 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 04:47:40 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:47:41 np0005540741 nova_compute[250706]: 2025-12-01 09:47:41.053 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:47:41 np0005540741 nova_compute[250706]: 2025-12-01 09:47:41.054 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 04:47:41 np0005540741 nova_compute[250706]: 2025-12-01 09:47:41.054 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 04:47:41 np0005540741 nova_compute[250706]: 2025-12-01 09:47:41.071 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 04:47:41 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1115: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:43 np0005540741 nova_compute[250706]: 2025-12-01 09:47:43.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:47:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:47:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:47:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:47:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:47:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec  1 04:47:43 np0005540741 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec  1 04:47:43 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1116: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:44 np0005540741 nova_compute[250706]: 2025-12-01 09:47:44.053 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:47:44 np0005540741 nova_compute[250706]: 2025-12-01 09:47:44.054 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:47:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec  1 04:47:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1514233815' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 04:47:44 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec  1 04:47:44 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1514233815' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 04:47:45 np0005540741 nova_compute[250706]: 2025-12-01 09:47:45.113 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:47:45 np0005540741 nova_compute[250706]: 2025-12-01 09:47:45.114 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  1 04:47:45 np0005540741 nova_compute[250706]: 2025-12-01 09:47:45.136 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  1 04:47:45 np0005540741 systemd-logind[788]: New session 55 of user zuul.
Dec  1 04:47:45 np0005540741 systemd[1]: Started Session 55 of User zuul.
Dec  1 04:47:45 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:47:45 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1117: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:47 np0005540741 nova_compute[250706]: 2025-12-01 09:47:47.075 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 04:47:47 np0005540741 nova_compute[250706]: 2025-12-01 09:47:47.116 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:47:47 np0005540741 nova_compute[250706]: 2025-12-01 09:47:47.116 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:47:47 np0005540741 nova_compute[250706]: 2025-12-01 09:47:47.116 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:47:47 np0005540741 nova_compute[250706]: 2025-12-01 09:47:47.117 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 04:47:47 np0005540741 nova_compute[250706]: 2025-12-01 09:47:47.117 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:47:47 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec  1 04:47:47 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1622289794' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 04:47:47 np0005540741 nova_compute[250706]: 2025-12-01 09:47:47.627 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:47:47 np0005540741 nova_compute[250706]: 2025-12-01 09:47:47.812 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 04:47:47 np0005540741 nova_compute[250706]: 2025-12-01 09:47:47.813 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5128MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 04:47:47 np0005540741 nova_compute[250706]: 2025-12-01 09:47:47.814 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 04:47:47 np0005540741 nova_compute[250706]: 2025-12-01 09:47:47.814 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 04:47:47 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1118: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:48 np0005540741 nova_compute[250706]: 2025-12-01 09:47:48.126 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 04:47:48 np0005540741 nova_compute[250706]: 2025-12-01 09:47:48.127 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 04:47:48 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15014 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:47:48 np0005540741 nova_compute[250706]: 2025-12-01 09:47:48.237 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Refreshing inventories for resource provider 847e3dbe-0f76-4032-a374-8c965945c22f _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  1 04:47:48 np0005540741 nova_compute[250706]: 2025-12-01 09:47:48.364 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Updating ProviderTree inventory for provider 847e3dbe-0f76-4032-a374-8c965945c22f from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  1 04:47:48 np0005540741 nova_compute[250706]: 2025-12-01 09:47:48.365 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Updating inventory in ProviderTree for provider 847e3dbe-0f76-4032-a374-8c965945c22f with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 04:47:48 np0005540741 nova_compute[250706]: 2025-12-01 09:47:48.418 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Refreshing aggregate associations for resource provider 847e3dbe-0f76-4032-a374-8c965945c22f, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  1 04:47:48 np0005540741 nova_compute[250706]: 2025-12-01 09:47:48.457 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Refreshing trait associations for resource provider 847e3dbe-0f76-4032-a374-8c965945c22f, traits: COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE4A,HW_CPU_X86_AESNI,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_F16C,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_MMX,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_BMI2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  1 04:47:48 np0005540741 nova_compute[250706]: 2025-12-01 09:47:48.495 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 04:47:48 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15016 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:47:48 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec  1 04:47:48 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1997129487' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 04:47:48 np0005540741 nova_compute[250706]: 2025-12-01 09:47:48.978 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 04:47:48 np0005540741 nova_compute[250706]: 2025-12-01 09:47:48.984 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 04:47:49 np0005540741 nova_compute[250706]: 2025-12-01 09:47:49.011 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 04:47:49 np0005540741 nova_compute[250706]: 2025-12-01 09:47:49.012 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 04:47:49 np0005540741 nova_compute[250706]: 2025-12-01 09:47:49.012 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 04:47:49 np0005540741 podman[276758]: 2025-12-01 09:47:49.049161653 +0000 UTC m=+0.139774698 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  1 04:47:49 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status"} v 0) v1
Dec  1 04:47:49 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/150026497' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec  1 04:47:49 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1119: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:50 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:47:51 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1120: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:52 np0005540741 ovs-vsctl[276842]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec  1 04:47:53 np0005540741 podman[276889]: 2025-12-01 09:47:53.015034561 +0000 UTC m=+0.106229073 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  1 04:47:53 np0005540741 virtqemud[250400]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec  1 04:47:53 np0005540741 virtqemud[250400]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec  1 04:47:53 np0005540741 virtqemud[250400]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  1 04:47:53 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: cache status {prefix=cache status} (starting...)
Dec  1 04:47:53 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1121: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:54 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: client ls {prefix=client ls} (starting...)
Dec  1 04:47:54 np0005540741 lvm[277184]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 04:47:54 np0005540741 lvm[277184]: VG ceph_vg0 finished
Dec  1 04:47:54 np0005540741 lvm[277192]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec  1 04:47:54 np0005540741 lvm[277192]: VG ceph_vg2 finished
Dec  1 04:47:54 np0005540741 lvm[277196]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec  1 04:47:54 np0005540741 lvm[277196]: VG ceph_vg1 finished
Dec  1 04:47:54 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15022 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:47:54 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: damage ls {prefix=damage ls} (starting...)
Dec  1 04:47:54 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump loads {prefix=dump loads} (starting...)
Dec  1 04:47:54 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15024 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:47:55 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec  1 04:47:55 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec  1 04:47:55 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec  1 04:47:55 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "report"} v 0) v1
Dec  1 04:47:55 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1567427368' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec  1 04:47:55 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec  1 04:47:55 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15030 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:47:55 np0005540741 ceph-mgr[75324]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec  1 04:47:55 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:47:55.663+0000 7fd2d6503640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec  1 04:47:55 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec  1 04:47:55 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec  1 04:47:55 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2474296016' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 04:47:55 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:47:55 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec  1 04:47:55 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1122: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:56 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: ops {prefix=ops} (starting...)
Dec  1 04:47:56 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config log"} v 0) v1
Dec  1 04:47:56 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3686709795' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec  1 04:47:56 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Dec  1 04:47:56 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2621865607' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec  1 04:47:56 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Dec  1 04:47:56 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4178876128' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec  1 04:47:56 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Dec  1 04:47:56 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3725325385' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  1 04:47:56 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: session ls {prefix=session ls} (starting...)
Dec  1 04:47:56 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15042 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:47:56 np0005540741 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: status {prefix=status} (starting...)
Dec  1 04:47:56 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Dec  1 04:47:56 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1138620584' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  1 04:47:57 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15046 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:47:57 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Dec  1 04:47:57 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4113522476' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  1 04:47:57 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "features"} v 0) v1
Dec  1 04:47:57 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/872774579' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec  1 04:47:57 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Dec  1 04:47:57 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/233936803' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  1 04:47:57 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1123: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:47:58 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Dec  1 04:47:58 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/490956330' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec  1 04:47:58 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Dec  1 04:47:58 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3674695074' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec  1 04:47:58 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15058 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:47:58 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:47:58.532+0000 7fd2d6503640 -1 mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Dec  1 04:47:58 np0005540741 ceph-mgr[75324]: mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Dec  1 04:47:58 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Dec  1 04:47:58 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2448350616' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  1 04:47:58 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Dec  1 04:47:58 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/566368364' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec  1 04:47:59 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15064 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:47:59 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Dec  1 04:47:59 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2541752812' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec  1 04:47:59 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15068 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:47:59 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Dec  1 04:47:59 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1199316515' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  1 04:47:59 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15072 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:47:59 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1124: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:48:00 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15075 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:48:00 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Dec  1 04:48:00 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3252037160' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1f scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1f scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57327616 unmapped: 1335296 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57327616 unmapped: 1335296 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 361309 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57327616 unmapped: 1335296 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57344000 unmapped: 1318912 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57344000 unmapped: 1318912 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57352192 unmapped: 1310720 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57352192 unmapped: 1310720 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 363605 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57352192 unmapped: 1310720 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57368576 unmapped: 1294336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.f scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.919653893s of 10.155382156s, submitted: 10
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.f scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57368576 unmapped: 1294336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57376768 unmapped: 1286144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57376768 unmapped: 1286144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 365900 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57376768 unmapped: 1286144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57384960 unmapped: 1277952 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57384960 unmapped: 1277952 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57393152 unmapped: 1269760 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57393152 unmapped: 1269760 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.e scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.e scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 367047 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57401344 unmapped: 1261568 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57401344 unmapped: 1261568 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57401344 unmapped: 1261568 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57409536 unmapped: 1253376 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57409536 unmapped: 1253376 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 367047 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57417728 unmapped: 1245184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57417728 unmapped: 1245184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57425920 unmapped: 1236992 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57434112 unmapped: 1228800 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57434112 unmapped: 1228800 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 367047 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57442304 unmapped: 1220608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57442304 unmapped: 1220608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57442304 unmapped: 1220608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.a scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.934640884s of 20.948490143s, submitted: 4
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.a scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57450496 unmapped: 1212416 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57458688 unmapped: 1204224 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 368194 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57458688 unmapped: 1204224 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57466880 unmapped: 1196032 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57466880 unmapped: 1196032 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57483264 unmapped: 1179648 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.14 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.14 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57491456 unmapped: 1171456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 371637 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57499648 unmapped: 1163264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.11 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.11 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57507840 unmapped: 1155072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.327646255s of 11.902298927s, submitted: 10
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 373933 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57532416 unmapped: 1130496 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57548800 unmapped: 1114112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.13 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.13 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57548800 unmapped: 1114112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.15 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.15 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57556992 unmapped: 1105920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.1f deep-scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.1f deep-scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57556992 unmapped: 1105920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 377377 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57524224 unmapped: 1138688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57524224 unmapped: 1138688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57524224 unmapped: 1138688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 378525 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57540608 unmapped: 1122304 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57548800 unmapped: 1114112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57565184 unmapped: 1097728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57565184 unmapped: 1097728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57565184 unmapped: 1097728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.18 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.905238152s of 15.089574814s, submitted: 10
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.18 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 379673 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57589760 unmapped: 1073152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57589760 unmapped: 1073152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57589760 unmapped: 1073152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57597952 unmapped: 1064960 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57597952 unmapped: 1064960 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 380821 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57606144 unmapped: 1056768 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57614336 unmapped: 1048576 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57622528 unmapped: 1040384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57622528 unmapped: 1040384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57622528 unmapped: 1040384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 381969 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57630720 unmapped: 1032192 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57630720 unmapped: 1032192 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57638912 unmapped: 1024000 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57638912 unmapped: 1024000 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.969743729s of 13.996441841s, submitted: 6
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57655296 unmapped: 1007616 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 383117 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57655296 unmapped: 1007616 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57655296 unmapped: 1007616 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57663488 unmapped: 999424 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57671680 unmapped: 991232 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57679872 unmapped: 983040 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 384265 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57688064 unmapped: 974848 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57688064 unmapped: 974848 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57696256 unmapped: 966656 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57696256 unmapped: 966656 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.e scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.e scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57704448 unmapped: 958464 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 386560 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57704448 unmapped: 958464 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57712640 unmapped: 950272 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57720832 unmapped: 942080 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.a scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.829962730s of 13.884933472s, submitted: 8
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.a scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57745408 unmapped: 917504 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57761792 unmapped: 901120 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 387707 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57761792 unmapped: 901120 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57761792 unmapped: 901120 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57769984 unmapped: 892928 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57769984 unmapped: 892928 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57778176 unmapped: 884736 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 388854 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57778176 unmapped: 884736 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57778176 unmapped: 884736 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57786368 unmapped: 876544 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.005264282s of 10.019852638s, submitted: 4
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57794560 unmapped: 868352 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57794560 unmapped: 868352 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 390001 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57802752 unmapped: 860160 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57802752 unmapped: 860160 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.2 deep-scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.2 deep-scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57819136 unmapped: 843776 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57827328 unmapped: 835584 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57843712 unmapped: 819200 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.c deep-scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.c deep-scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 393442 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57843712 unmapped: 819200 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.7 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.7 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57851904 unmapped: 811008 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57851904 unmapped: 811008 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57851904 unmapped: 811008 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57860096 unmapped: 802816 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 394589 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57868288 unmapped: 794624 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57868288 unmapped: 794624 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57876480 unmapped: 786432 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57876480 unmapped: 786432 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1a scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.707541466s of 15.794960022s, submitted: 10
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1a scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57884672 unmapped: 778240 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 395737 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57884672 unmapped: 778240 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57884672 unmapped: 778240 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57892864 unmapped: 770048 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57909248 unmapped: 753664 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57917440 unmapped: 745472 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 398032 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57917440 unmapped: 745472 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57917440 unmapped: 745472 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57925632 unmapped: 737280 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1e scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1e scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57925632 unmapped: 737280 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57925632 unmapped: 737280 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 400327 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57933824 unmapped: 729088 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57942016 unmapped: 720896 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57950208 unmapped: 712704 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.e scrub starts
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.057037354s of 14.124808311s, submitted: 10
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.e scrub ok
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57958400 unmapped: 704512 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57958400 unmapped: 704512 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57966592 unmapped: 696320 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57966592 unmapped: 696320 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57982976 unmapped: 679936 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57982976 unmapped: 679936 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57991168 unmapped: 671744 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57991168 unmapped: 671744 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57991168 unmapped: 671744 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57999360 unmapped: 663552 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57999360 unmapped: 663552 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58007552 unmapped: 655360 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58007552 unmapped: 655360 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58015744 unmapped: 647168 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58023936 unmapped: 638976 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58023936 unmapped: 638976 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58032128 unmapped: 630784 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58032128 unmapped: 630784 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 622592 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 622592 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 622592 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 614400 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 622592 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 614400 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 614400 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 614400 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58056704 unmapped: 606208 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58056704 unmapped: 606208 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58064896 unmapped: 598016 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58073088 unmapped: 589824 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58081280 unmapped: 581632 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58081280 unmapped: 581632 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58081280 unmapped: 581632 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58089472 unmapped: 573440 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58089472 unmapped: 573440 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58105856 unmapped: 557056 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58105856 unmapped: 557056 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58105856 unmapped: 557056 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58114048 unmapped: 548864 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58114048 unmapped: 548864 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58114048 unmapped: 548864 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58122240 unmapped: 540672 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58130432 unmapped: 532480 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58138624 unmapped: 524288 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58138624 unmapped: 524288 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58138624 unmapped: 524288 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58146816 unmapped: 516096 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58146816 unmapped: 516096 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58155008 unmapped: 507904 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58163200 unmapped: 499712 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 491520 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 491520 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 491520 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58179584 unmapped: 483328 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58179584 unmapped: 483328 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58187776 unmapped: 475136 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58187776 unmapped: 475136 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58187776 unmapped: 475136 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58195968 unmapped: 466944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58195968 unmapped: 466944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58195968 unmapped: 466944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58204160 unmapped: 458752 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58204160 unmapped: 458752 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58212352 unmapped: 450560 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58212352 unmapped: 450560 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58220544 unmapped: 442368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58220544 unmapped: 442368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58220544 unmapped: 442368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 434176 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 434176 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 434176 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58236928 unmapped: 425984 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58236928 unmapped: 425984 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58245120 unmapped: 417792 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58253312 unmapped: 409600 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58253312 unmapped: 409600 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58261504 unmapped: 401408 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58261504 unmapped: 401408 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58269696 unmapped: 393216 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58269696 unmapped: 393216 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58269696 unmapped: 393216 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58277888 unmapped: 385024 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58286080 unmapped: 376832 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 368640 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 368640 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 368640 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58302464 unmapped: 360448 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58302464 unmapped: 360448 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 352256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 352256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 352256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58318848 unmapped: 344064 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58327040 unmapped: 335872 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58335232 unmapped: 327680 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58335232 unmapped: 327680 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58343424 unmapped: 319488 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58343424 unmapped: 319488 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58359808 unmapped: 303104 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58359808 unmapped: 303104 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58359808 unmapped: 303104 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58368000 unmapped: 294912 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58368000 unmapped: 294912 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58376192 unmapped: 286720 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58384384 unmapped: 278528 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58384384 unmapped: 278528 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 270336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 270336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 270336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58400768 unmapped: 262144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58400768 unmapped: 262144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58408960 unmapped: 253952 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58408960 unmapped: 253952 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 237568 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58433536 unmapped: 229376 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58433536 unmapped: 229376 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 221184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 221184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 221184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 212992 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 212992 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 212992 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58458112 unmapped: 204800 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58458112 unmapped: 204800 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 196608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 196608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 196608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 188416 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 188416 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58482688 unmapped: 180224 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58482688 unmapped: 180224 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58490880 unmapped: 172032 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58490880 unmapped: 172032 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 155648 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 122880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 155648 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 155648 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 122880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 122880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 122880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58548224 unmapped: 114688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58548224 unmapped: 114688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 106496 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 106496 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58564608 unmapped: 98304 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58564608 unmapped: 98304 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58564608 unmapped: 98304 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58572800 unmapped: 90112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58572800 unmapped: 90112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 81920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 81920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 81920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58589184 unmapped: 73728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58597376 unmapped: 65536 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 57344 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 57344 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 57344 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58613760 unmapped: 49152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58613760 unmapped: 49152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58621952 unmapped: 40960 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58621952 unmapped: 40960 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 32768 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 32768 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 32768 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 24576 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 24576 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 24576 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58646528 unmapped: 16384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58646528 unmapped: 16384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58654720 unmapped: 8192 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58654720 unmapped: 8192 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58662912 unmapped: 0 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58662912 unmapped: 0 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58662912 unmapped: 0 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58679296 unmapped: 1032192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58679296 unmapped: 1032192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58679296 unmapped: 1032192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58687488 unmapped: 1024000 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58687488 unmapped: 1024000 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58712064 unmapped: 999424 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58712064 unmapped: 999424 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58744832 unmapped: 966656 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58744832 unmapped: 966656 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58753024 unmapped: 958464 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58753024 unmapped: 958464 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58753024 unmapped: 958464 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58769408 unmapped: 942080 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58769408 unmapped: 942080 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58769408 unmapped: 942080 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58777600 unmapped: 933888 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58777600 unmapped: 933888 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58785792 unmapped: 925696 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58785792 unmapped: 925696 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58785792 unmapped: 925696 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58810368 unmapped: 901120 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58810368 unmapped: 901120 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58810368 unmapped: 901120 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58826752 unmapped: 884736 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58826752 unmapped: 884736 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58834944 unmapped: 876544 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58834944 unmapped: 876544 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58875904 unmapped: 835584 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58875904 unmapped: 835584 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58884096 unmapped: 827392 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58884096 unmapped: 827392 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58884096 unmapped: 827392 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58892288 unmapped: 819200 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58892288 unmapped: 819200 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58900480 unmapped: 811008 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58900480 unmapped: 811008 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58900480 unmapped: 811008 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58916864 unmapped: 794624 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58916864 unmapped: 794624 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58916864 unmapped: 794624 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58941440 unmapped: 770048 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58941440 unmapped: 770048 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58949632 unmapped: 761856 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58949632 unmapped: 761856 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58957824 unmapped: 753664 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58966016 unmapped: 745472 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58966016 unmapped: 745472 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58998784 unmapped: 712704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58998784 unmapped: 712704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59006976 unmapped: 704512 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59006976 unmapped: 704512 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59006976 unmapped: 704512 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59015168 unmapped: 696320 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59015168 unmapped: 696320 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59023360 unmapped: 688128 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59023360 unmapped: 688128 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59047936 unmapped: 663552 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59047936 unmapped: 663552 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59047936 unmapped: 663552 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59056128 unmapped: 655360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59056128 unmapped: 655360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59080704 unmapped: 630784 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59080704 unmapped: 630784 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59097088 unmapped: 614400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59097088 unmapped: 614400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59121664 unmapped: 589824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59138048 unmapped: 573440 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59138048 unmapped: 573440 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59162624 unmapped: 548864 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59162624 unmapped: 548864 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59170816 unmapped: 540672 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59219968 unmapped: 491520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.4 total, 600.0 interval
Cumulative writes: 4162 writes, 19K keys, 4162 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
Cumulative WAL: 4162 writes, 352 syncs, 11.82 writes per sync, written: 0.02 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 4162 writes, 19K keys, 4162 commit groups, 1.0 writes per commit group, ingest: 15.87 MB, 0.03 MB/s
Interval WAL: 4162 writes, 352 syncs, 11.82 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.4 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.4 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.4 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59383808 unmapped: 327680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59383808 unmapped: 327680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59383808 unmapped: 327680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59465728 unmapped: 245760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59465728 unmapped: 245760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59547648 unmapped: 163840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59547648 unmapped: 163840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59547648 unmapped: 163840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59621376 unmapped: 90112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59621376 unmapped: 90112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: mgrc ms_handle_reset ms_handle_reset con 0x5595d67d3c00
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3312476512
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3312476512,v1:192.168.122.100:6801/3312476512]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: mgrc handle_mgr_configure stats_period=5
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15078 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 ms_handle_reset con 0x5595d6fa8c00 session 0x5595d72c4960
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.4 total, 600.0 interval#012Cumulative writes: 4162 writes, 19K keys, 4162 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4162 writes, 352 syncs, 11.82 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.4 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.4 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.4 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, 
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 47 handle_osd_map epochs [48,48], i have 47, src has [1,48]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 1077.389038086s of 1077.396484375s, submitted: 2
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 48 heartbeat osd_stat(store_statfs(0x4fe14e000/0x0/0x4ffc00000, data 0x2f8bc/0x7f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 48 handle_osd_map epochs [49,49], i have 48, src has [1,49]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 9977856 heap: 70074368 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 49 handle_osd_map epochs [50,50], i have 49, src has [1,50]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 50 ms_handle_reset con 0x5595d7ac9400 session 0x5595d73d83c0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61227008 unmapped: 8847360 heap: 70074368 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 50 heartbeat osd_stat(store_statfs(0x4fd065000/0x0/0x4ffc00000, data 0x11124f6/0x1168000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61538304 unmapped: 16932864 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 50 handle_osd_map epochs [51,51], i have 50, src has [1,51]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 51 ms_handle_reset con 0x5595d8136c00 session 0x5595d6e51a40
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 540377 data_alloc: 218103808 data_used: 28672
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 51 heartbeat osd_stat(store_statfs(0x4fd061000/0x0/0x4ffc00000, data 0x1113acc/0x116b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 51 heartbeat osd_stat(store_statfs(0x4fd061000/0x0/0x4ffc00000, data 0x1113acc/0x116b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 51 heartbeat osd_stat(store_statfs(0x4fd061000/0x0/0x4ffc00000, data 0x1113acc/0x116b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 540377 data_alloc: 218103808 data_used: 28672
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 51 heartbeat osd_stat(store_statfs(0x4fd061000/0x0/0x4ffc00000, data 0x1113acc/0x116b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 51 handle_osd_map epochs [52,52], i have 51, src has [1,52]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 51 handle_osd_map epochs [52,52], i have 52, src has [1,52]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.469475746s of 11.739644051s, submitted: 54
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f6c/0x116e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f6c/0x116e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f6c/0x116e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f6c/0x116e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 24.064493179s of 24.076759338s, submitted: 13
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f8f/0x116f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 52 handle_osd_map epochs [53,53], i have 52, src has [1,53]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 53 ms_handle_reset con 0x5595d8137000 session 0x5595d8062f00
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61677568 unmapped: 16793600 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 53 ms_handle_reset con 0x5595d8137400 session 0x5595d729f680
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 53 ms_handle_reset con 0x5595d8137800 session 0x5595d81a34a0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61661184 unmapped: 16809984 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 53 heartbeat osd_stat(store_statfs(0x4fd05b000/0x0/0x4ffc00000, data 0x1116549/0x1172000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61661184 unmapped: 16809984 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 53 heartbeat osd_stat(store_statfs(0x4fd05b000/0x0/0x4ffc00000, data 0x1116549/0x1172000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 53 handle_osd_map epochs [53,54], i have 53, src has [1,54]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 53 handle_osd_map epochs [54,54], i have 54, src has [1,54]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 555402 data_alloc: 218103808 data_used: 45056
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 54 ms_handle_reset con 0x5595d7ac9400 session 0x5595d81a3e00
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 62808064 unmapped: 15663104 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 54 ms_handle_reset con 0x5595d8137000 session 0x5595d80623c0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 54 ms_handle_reset con 0x5595d8137400 session 0x5595d6e50f00
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 62824448 unmapped: 15646720 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 54 heartbeat osd_stat(store_statfs(0x4fd057000/0x0/0x4ffc00000, data 0x1117b13/0x1176000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 54 handle_osd_map epochs [54,55], i have 54, src has [1,55]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 55 ms_handle_reset con 0x5595d8137c00 session 0x5595d6e512c0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 62988288 unmapped: 15482880 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 62988288 unmapped: 15482880 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 55 handle_osd_map epochs [56,56], i have 55, src has [1,56]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 56 ms_handle_reset con 0x5595d9b9e800 session 0x5595d80921e0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 64208896 unmapped: 14262272 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 56 heartbeat osd_stat(store_statfs(0x4fd055000/0x0/0x4ffc00000, data 0x1119111/0x1178000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 559201 data_alloc: 218103808 data_used: 45056
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 64241664 unmapped: 14229504 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 56 handle_osd_map epochs [56,57], i have 56, src has [1,57]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 57 handle_osd_map epochs [57,57], i have 57, src has [1,57]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 57 ms_handle_reset con 0x5595d9b9e800 session 0x5595d8075680
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 64421888 unmapped: 14049280 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 64421888 unmapped: 14049280 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.944048882s of 11.445683479s, submitted: 143
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 57 handle_osd_map epochs [57,58], i have 57, src has [1,58]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 57 handle_osd_map epochs [58,58], i have 58, src has [1,58]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 58 ms_handle_reset con 0x5595d9b9ec00 session 0x5595d729eb40
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 58 ms_handle_reset con 0x5595d7ac9400 session 0x5595d80a52c0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 65781760 unmapped: 21086208 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 58 handle_osd_map epochs [58,59], i have 58, src has [1,59]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 58 handle_osd_map epochs [59,59], i have 59, src has [1,59]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 59 ms_handle_reset con 0x5595d8137c00 session 0x5595d7b001e0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 65945600 unmapped: 20922368 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 59 heartbeat osd_stat(store_statfs(0x4fc84b000/0x0/0x4ffc00000, data 0x191d2db/0x1982000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 854268 data_alloc: 218103808 data_used: 61440
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 59 handle_osd_map epochs [60,60], i have 59, src has [1,60]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 60 ms_handle_reset con 0x5595d8137400 session 0x5595d7bae780
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 60 ms_handle_reset con 0x5595d8136c00 session 0x5595d80785a0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 66093056 unmapped: 20774912 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 60 handle_osd_map epochs [60,61], i have 60, src has [1,61]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 61 ms_handle_reset con 0x5595d8137400 session 0x5595d73d83c0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 61 heartbeat osd_stat(store_statfs(0x4fa844000/0x0/0x4ffc00000, data 0x391fecd/0x398a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 66355200 unmapped: 20512768 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 61 handle_osd_map epochs [62,62], i have 61, src has [1,62]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 62 ms_handle_reset con 0x5595d8137c00 session 0x5595d8197860
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 62 ms_handle_reset con 0x5595d7ac9400 session 0x5595d729e960
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 66609152 unmapped: 20258816 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 62 handle_osd_map epochs [63,63], i have 62, src has [1,63]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 63 ms_handle_reset con 0x5595d8137000 session 0x5595d729f680
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 63 ms_handle_reset con 0x5595d9b9e800 session 0x5595d81963c0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 63 heartbeat osd_stat(store_statfs(0x4fc83a000/0x0/0x4ffc00000, data 0x11237d9/0x1192000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 63 ms_handle_reset con 0x5595d9b9ec00 session 0x5595d80a4b40
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 18751488 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 63 handle_osd_map epochs [63,64], i have 63, src has [1,64]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 64 heartbeat osd_stat(store_statfs(0x4fc426000/0x0/0x4ffc00000, data 0x1124dd4/0x1194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 18751488 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 64 handle_osd_map epochs [64,65], i have 64, src has [1,65]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 65 ms_handle_reset con 0x5595d8136c00 session 0x5595d7b16d20
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 65 ms_handle_reset con 0x5595d7ac9400 session 0x5595d8079a40
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 65 ms_handle_reset con 0x5595d8137400 session 0x5595d8196b40
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 613840 data_alloc: 218103808 data_used: 122880
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 65 heartbeat osd_stat(store_statfs(0x4fc422000/0x0/0x4ffc00000, data 0x11263ba/0x1197000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 18694144 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 65 handle_osd_map epochs [66,66], i have 65, src has [1,66]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 66 ms_handle_reset con 0x5595d8137400 session 0x5595d729f680
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68124672 unmapped: 18743296 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 66 handle_osd_map epochs [66,67], i have 66, src has [1,67]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 66 handle_osd_map epochs [67,67], i have 67, src has [1,67]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 67 ms_handle_reset con 0x5595d7ac9400 session 0x5595d8092b40
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68132864 unmapped: 18735104 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 67 handle_osd_map epochs [67,68], i have 67, src has [1,68]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.158089638s of 10.213048935s, submitted: 251
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68386816 unmapped: 18481152 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 68 ms_handle_reset con 0x5595d8136c00 session 0x5595d7b01680
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 68 handle_osd_map epochs [69,69], i have 68, src has [1,69]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 69 ms_handle_reset con 0x5595d9b9e800 session 0x5595d82c2f00
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 69 ms_handle_reset con 0x5595d9b9ec00 session 0x5595d729e3c0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68444160 unmapped: 18423808 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 69 heartbeat osd_stat(store_statfs(0x4fcc19000/0x0/0x4ffc00000, data 0x112ce2f/0x11a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 69 handle_osd_map epochs [70,70], i have 69, src has [1,70]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 630316 data_alloc: 218103808 data_used: 139264
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 70 ms_handle_reset con 0x5595d7ac9400 session 0x5595d729eb40
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 70 ms_handle_reset con 0x5595d8136c00 session 0x5595d729fc20
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68624384 unmapped: 18243584 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 70 handle_osd_map epochs [71,71], i have 70, src has [1,71]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 71 ms_handle_reset con 0x5595d8137400 session 0x5595d8196f00
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 18169856 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 18169856 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 71 ms_handle_reset con 0x5595d9b9e800 session 0x5595d73e14a0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 18169856 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 71 ms_handle_reset con 0x5595d8137c00 session 0x5595d73e1680
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 71 heartbeat osd_stat(store_statfs(0x4fcc12000/0x0/0x4ffc00000, data 0x112f364/0x11a9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 18169856 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 71 ms_handle_reset con 0x5595d7ac9400 session 0x5595d73d83c0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 71 handle_osd_map epochs [72,72], i have 71, src has [1,72]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 634276 data_alloc: 218103808 data_used: 139264
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68771840 unmapped: 18096128 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 72 handle_osd_map epochs [72,73], i have 72, src has [1,73]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 72 handle_osd_map epochs [73,73], i have 73, src has [1,73]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 73 ms_handle_reset con 0x5595d8136c00 session 0x5595d73d8f00
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 73 heartbeat osd_stat(store_statfs(0x4fcc10000/0x0/0x4ffc00000, data 0x1130982/0x11ad000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68771840 unmapped: 18096128 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 73 heartbeat osd_stat(store_statfs(0x4fba6d000/0x0/0x4ffc00000, data 0x113204f/0x11b0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68788224 unmapped: 18079744 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68788224 unmapped: 18079744 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68788224 unmapped: 18079744 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.005791664s of 11.874329567s, submitted: 231
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 73 handle_osd_map epochs [74,74], i have 73, src has [1,74]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 74 ms_handle_reset con 0x5595d9b9c800 session 0x5595d739d860
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 640656 data_alloc: 218103808 data_used: 151552
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68911104 unmapped: 17956864 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 74 handle_osd_map epochs [74,75], i have 74, src has [1,75]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 75 ms_handle_reset con 0x5595d9b9c400 session 0x5595d81974a0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68919296 unmapped: 17948672 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 75 heartbeat osd_stat(store_statfs(0x4fba64000/0x0/0x4ffc00000, data 0x1134c7d/0x11b8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 75 ms_handle_reset con 0x5595d9b9c000 session 0x5595d80a4d20
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68952064 unmapped: 17915904 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 75 heartbeat osd_stat(store_statfs(0x4fba64000/0x0/0x4ffc00000, data 0x1134c7d/0x11b8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 75 ms_handle_reset con 0x5595d7ac9400 session 0x5595d7d3a780
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 75 handle_osd_map epochs [75,76], i have 75, src has [1,76]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 76 ms_handle_reset con 0x5595d8136c00 session 0x5595d72b8b40
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 69074944 unmapped: 17793024 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 76 handle_osd_map epochs [77,77], i have 76, src has [1,77]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 69099520 unmapped: 17768448 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 77 ms_handle_reset con 0x5595d9b9c000 session 0x5595d729e3c0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 654076 data_alloc: 218103808 data_used: 155648
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 77 handle_osd_map epochs [78,78], i have 77, src has [1,78]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 69255168 unmapped: 17612800 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 78 ms_handle_reset con 0x5595d9b9c400 session 0x5595d7b16b40
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 78 handle_osd_map epochs [79,79], i have 78, src has [1,79]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70246400 unmapped: 16621568 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 79 heartbeat osd_stat(store_statfs(0x4fba59000/0x0/0x4ffc00000, data 0x113a3c8/0x11c3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70246400 unmapped: 16621568 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70246400 unmapped: 16621568 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d9b9c800 session 0x5595d72c52c0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d8136c00 session 0x5595d73d92c0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d7ac9400 session 0x5595d8197680
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d9b9c000 session 0x5595d8074b40
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d9b9c400 session 0x5595d729e960
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d72ad000 session 0x5595d80a5a40
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70254592 unmapped: 16613376 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d7ac9400 session 0x5595d80a4960
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d8136c00 session 0x5595d7b01e00
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 665586 data_alloc: 218103808 data_used: 172032
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70254592 unmapped: 16613376 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d9b9c000 session 0x5595d81a3a40
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 79 handle_osd_map epochs [79,80], i have 79, src has [1,80]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.607955933s of 11.038110733s, submitted: 98
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 80 ms_handle_reset con 0x5595d9b9c400 session 0x5595d7b16b40
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 80 ms_handle_reset con 0x5595d7348800 session 0x5595d7b165a0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 80 heartbeat osd_stat(store_statfs(0x4fba59000/0x0/0x4ffc00000, data 0x113a3c8/0x11c3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 16662528 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 16662528 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 80 heartbeat osd_stat(store_statfs(0x4fba57000/0x0/0x4ffc00000, data 0x113b8a0/0x11c6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 16662528 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 16646144 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 667454 data_alloc: 218103808 data_used: 172032
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 16646144 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 80 ms_handle_reset con 0x5595d9b9c000 session 0x5595d7bae3c0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70230016 unmapped: 16637952 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 80 handle_osd_map epochs [81,81], i have 80, src has [1,81]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9b9c400 session 0x5595d7b00780
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9598400 session 0x5595d8062f00
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9598000 session 0x5595d7b00f00
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9598800 session 0x5595d80743c0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70279168 unmapped: 16588800 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9598000 session 0x5595d72b90e0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 81 handle_osd_map epochs [82,82], i have 81, src has [1,82]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 82 heartbeat osd_stat(store_statfs(0x4fba54000/0x0/0x4ffc00000, data 0x113ce5a/0x11c9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70328320 unmapped: 16539648 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 82 handle_osd_map epochs [83,83], i have 82, src has [1,83]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9598400 session 0x5595d73e0d20
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70352896 unmapped: 16515072 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 83 heartbeat osd_stat(store_statfs(0x4fba50000/0x0/0x4ffc00000, data 0x113e468/0x11cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9b9c000 session 0x5595d739d0e0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9b9c400 session 0x5595d6de4d20
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9598c00 session 0x5595d7b16b40
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 679530 data_alloc: 218103808 data_used: 172032
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70336512 unmapped: 16531456 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.183552742s of 10.257410049s, submitted: 36
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 16498688 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9598000 session 0x5595d7b165a0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 83 heartbeat osd_stat(store_statfs(0x4fba4e000/0x0/0x4ffc00000, data 0x113fa80/0x11d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70377472 unmapped: 16490496 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 83 handle_osd_map epochs [84,84], i have 83, src has [1,84]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 84 ms_handle_reset con 0x5595d9598400 session 0x5595d72c52c0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 16482304 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 84 ms_handle_reset con 0x5595d7ac9400 session 0x5595d7b170e0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 84 ms_handle_reset con 0x5595d8136c00 session 0x5595d73d9860
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 16482304 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 680355 data_alloc: 218103808 data_used: 192512
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 16482304 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 84 handle_osd_map epochs [84,85], i have 84, src has [1,85]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 84 heartbeat osd_stat(store_statfs(0x4fba4c000/0x0/0x4ffc00000, data 0x1141058/0x11d1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 16465920 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 85 handle_osd_map epochs [86,86], i have 85, src has [1,86]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 86 ms_handle_reset con 0x5595d9b9c000 session 0x5595d8074b40
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 16465920 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 86 heartbeat osd_stat(store_statfs(0x4fba45000/0x0/0x4ffc00000, data 0x1143b22/0x11d7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 16465920 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 16433152 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 86 ms_handle_reset con 0x5595d8137400 session 0x5595d73d8780
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 86 ms_handle_reset con 0x5595d9b9e800 session 0x5595d8196960
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 687015 data_alloc: 218103808 data_used: 192512
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 86 ms_handle_reset con 0x5595d7ac9400 session 0x5595d73d9a40
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 16433152 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 86 handle_osd_map epochs [86,87], i have 86, src has [1,87]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 16498688 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.584367752s of 11.068427086s, submitted: 87
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 87 ms_handle_reset con 0x5595d8136c00 session 0x5595d80634a0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 87 heartbeat osd_stat(store_statfs(0x4fba43000/0x0/0x4ffc00000, data 0x1144fde/0x11da000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70418432 unmapped: 16449536 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 87 handle_osd_map epochs [87,88], i have 87, src has [1,88]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 88 ms_handle_reset con 0x5595d9598000 session 0x5595d7bae1e0
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 16441344 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 16441344 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 88 heartbeat osd_stat(store_statfs(0x4fba41000/0x0/0x4ffc00000, data 0x11465ba/0x11dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 690967 data_alloc: 218103808 data_used: 196608
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 16441344 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 16441344 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 88 heartbeat osd_stat(store_statfs(0x4fba41000/0x0/0x4ffc00000, data 0x11465ba/0x11dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 88 handle_osd_map epochs [89,89], i have 88, src has [1,89]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 693939 data_alloc: 218103808 data_used: 196608
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 89 handle_osd_map epochs [89,90], i have 89, src has [1,90]
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.918384552s of 11.027014732s, submitted: 76
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3e000/0x0/0x4ffc00000, data 0x1147a76/0x11df000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70508544 unmapped: 16359424 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'config diff' '{prefix=config diff}'
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'config show' '{prefix=config show}'
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 16056320 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'counter dump' '{prefix=counter dump}'
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'counter schema' '{prefix=counter schema}'
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 15605760 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 15720448 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'log dump' '{prefix=log dump}'
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 26763264 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'perf dump' '{prefix=perf dump}'
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'perf schema' '{prefix=perf schema}'
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/361956077' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.4 total, 600.0 interval#012Cumulative writes: 5984 writes, 24K keys, 5984 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5984 writes, 1172 syncs, 5.11 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1822 writes, 4802 keys, 1822 commit groups, 1.0 writes per commit group, ingest: 2.46 MB, 0.00 MB/s#012Interval WAL: 1822 writes, 820 syncs, 2.22 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 ms_handle_reset con 0x5595d9203c00 session 0x5595d6ef6000
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'config diff' '{prefix=config diff}'
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'config show' '{prefix=config show}'
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'counter dump' '{prefix=counter dump}'
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'counter schema' '{prefix=counter schema}'
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 26353664 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 26198016 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec  1 04:48:00 np0005540741 ceph-osd[90166]: do_command 'log dump' '{prefix=log dump}'
Dec  1 04:48:00 np0005540741 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:48:01 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15082 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:48:01 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Dec  1 04:48:01 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2187297463' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  1 04:48:01 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15086 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec  1 04:48:01 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Dec  1 04:48:01 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3612791590' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  1 04:48:01 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15090 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec  1 04:48:01 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1125: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:48:02 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Dec  1 04:48:02 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1125510605' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec  1 04:48:02 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15094 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec  1 04:48:02 np0005540741 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15100 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec  1 04:48:02 np0005540741 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:48:02.938+0000 7fd2d6503640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec  1 04:48:02 np0005540741 ceph-mgr[75324]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec  1 04:48:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "node ls"} v 0) v1
Dec  1 04:48:03 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1879242237' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec  1 04:48:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Dec  1 04:48:03 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1499303170' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec  1 04:48:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Dec  1 04:48:03 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2037774809' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec  1 04:48:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush dump"} v 0) v1
Dec  1 04:48:03 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4141685720' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec  1 04:48:03 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Dec  1 04:48:03 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/413353018' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec  1 04:48:03 np0005540741 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1126: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec  1 04:48:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Dec  1 04:48:04 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1409976490' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec  1 04:48:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Dec  1 04:48:04 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3574915099' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec  1 04:48:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0) v1
Dec  1 04:48:04 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1136288052' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec  1 04:48:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Dec  1 04:48:04 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1232343839' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec  1 04:48:04 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0) v1
Dec  1 04:48:04 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4113344786' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec  1 04:48:05 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Dec  1 04:48:05 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2188931726' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58376192 unmapped: 1335296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58384384 unmapped: 1327104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 1318912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 383751 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 1318912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 1318912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58400768 unmapped: 1310720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58417152 unmapped: 1294336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 1286144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 386047 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 1286144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 1286144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 1269760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 1269760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 1261568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 386047 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 1261568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.f deep-scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.945797920s of 14.976054192s, submitted: 8
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.f deep-scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 1261568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58458112 unmapped: 1253376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 1245184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 1236992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 388341 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.d scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.d scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 1236992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.c scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.c scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58482688 unmapped: 1228800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58490880 unmapped: 1220608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.6 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58490880 unmapped: 1220608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58499072 unmapped: 1212416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 391782 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.c deep-scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.c deep-scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 1204224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.d scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.d scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 1204224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1 deep-scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.828042984s of 10.924718857s, submitted: 14
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1 deep-scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 1187840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.f scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.f scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 1179648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.5 deep-scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.5 deep-scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 1171456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.7 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.7 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 398664 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58548224 unmapped: 1163264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 1155072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58572800 unmapped: 1138688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58572800 unmapped: 1138688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 1130496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 398664 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 1130496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58589184 unmapped: 1122304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58597376 unmapped: 1114112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.3 deep-scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.857633591s of 11.003772736s, submitted: 8
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.3 deep-scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 1105920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 1105920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 399811 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 1105920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58613760 unmapped: 1097728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58621952 unmapped: 1089536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58621952 unmapped: 1089536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 1081344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 400958 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 1081344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 1073152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.9 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.9 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58646528 unmapped: 1064960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58654720 unmapped: 1056768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58654720 unmapped: 1056768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 402105 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.16 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.033282280s of 12.157509804s, submitted: 6
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.16 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58687488 unmapped: 1024000 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 404401 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.12 deep-scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.12 deep-scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 405549 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58736640 unmapped: 974848 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58736640 unmapped: 974848 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.857577324s of 11.936425209s, submitted: 6
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58761216 unmapped: 950272 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58761216 unmapped: 950272 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58785792 unmapped: 925696 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 407845 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58810368 unmapped: 901120 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.a scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.a scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 408992 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.d scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.989212036s of 10.056298256s, submitted: 6
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.d scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58834944 unmapped: 876544 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.6 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.6 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411286 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 414727 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.b scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.990326881s of 10.023887634s, submitted: 10
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.b scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58875904 unmapped: 835584 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.5 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.5 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58892288 unmapped: 819200 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 417021 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.e scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.e scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.8 deep-scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.8 deep-scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 420462 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58966016 unmapped: 745472 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.17 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.17 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.916434288s of 10.038483620s, submitted: 12
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 423906 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58998784 unmapped: 712704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59056128 unmapped: 655360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59056128 unmapped: 655360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 427349 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.1b deep-scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.910740852s of 10.036123276s, submitted: 10
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.1b deep-scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 428497 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59121664 unmapped: 589824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59121664 unmapped: 589824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1d scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1d scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59121664 unmapped: 589824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1c scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1c scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 430793 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59162624 unmapped: 548864 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59170816 unmapped: 540672 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59219968 unmapped: 491520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59219968 unmapped: 491520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59228160 unmapped: 483328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59228160 unmapped: 483328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59236352 unmapped: 475136 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59244544 unmapped: 466944 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59244544 unmapped: 466944 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59252736 unmapped: 458752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59252736 unmapped: 458752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59260928 unmapped: 450560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59260928 unmapped: 450560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59260928 unmapped: 450560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59269120 unmapped: 442368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59269120 unmapped: 442368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59277312 unmapped: 434176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59269120 unmapped: 442368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59269120 unmapped: 442368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59277312 unmapped: 434176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59277312 unmapped: 434176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59285504 unmapped: 425984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59285504 unmapped: 425984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59285504 unmapped: 425984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59531264 unmapped: 180224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59531264 unmapped: 180224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59621376 unmapped: 90112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59678720 unmapped: 32768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59678720 unmapped: 32768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59678720 unmapped: 32768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59686912 unmapped: 24576 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59686912 unmapped: 24576 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 8192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 8192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 0 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 0 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59736064 unmapped: 1024000 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59736064 unmapped: 1024000 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59744256 unmapped: 1015808 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59744256 unmapped: 1015808 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59801600 unmapped: 958464 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59801600 unmapped: 958464 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59817984 unmapped: 942080 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59817984 unmapped: 942080 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59817984 unmapped: 942080 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59826176 unmapped: 933888 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59826176 unmapped: 933888 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59834368 unmapped: 925696 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59834368 unmapped: 925696 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59834368 unmapped: 925696 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59850752 unmapped: 909312 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59850752 unmapped: 909312 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59867136 unmapped: 892928 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59867136 unmapped: 892928 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59883520 unmapped: 876544 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59883520 unmapped: 876544 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59883520 unmapped: 876544 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59891712 unmapped: 868352 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59891712 unmapped: 868352 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59908096 unmapped: 851968 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59908096 unmapped: 851968 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59916288 unmapped: 843776 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59916288 unmapped: 843776 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59924480 unmapped: 835584 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59924480 unmapped: 835584 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59924480 unmapped: 835584 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59940864 unmapped: 819200 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59940864 unmapped: 819200 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59949056 unmapped: 811008 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59949056 unmapped: 811008 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59965440 unmapped: 794624 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59965440 unmapped: 794624 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59965440 unmapped: 794624 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59981824 unmapped: 778240 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59981824 unmapped: 778240 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59990016 unmapped: 770048 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59990016 unmapped: 770048 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59990016 unmapped: 770048 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60022784 unmapped: 737280 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60022784 unmapped: 737280 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60022784 unmapped: 737280 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60039168 unmapped: 720896 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60039168 unmapped: 720896 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60047360 unmapped: 712704 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60047360 unmapped: 712704 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60047360 unmapped: 712704 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60055552 unmapped: 704512 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60055552 unmapped: 704512 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60071936 unmapped: 688128 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60071936 unmapped: 688128 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60080128 unmapped: 679936 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60080128 unmapped: 679936 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60080128 unmapped: 679936 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60088320 unmapped: 671744 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60088320 unmapped: 671744 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 663552 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 663552 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 663552 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60121088 unmapped: 638976 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60121088 unmapped: 638976 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60129280 unmapped: 630784 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60129280 unmapped: 630784 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60129280 unmapped: 630784 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60137472 unmapped: 622592 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60137472 unmapped: 622592 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60153856 unmapped: 606208 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60153856 unmapped: 606208 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60178432 unmapped: 581632 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60178432 unmapped: 581632 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60186624 unmapped: 573440 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60186624 unmapped: 573440 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60194816 unmapped: 565248 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60194816 unmapped: 565248 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60194816 unmapped: 565248 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60211200 unmapped: 548864 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60211200 unmapped: 548864 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60227584 unmapped: 532480 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60227584 unmapped: 532480 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60235776 unmapped: 524288 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60235776 unmapped: 524288 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60243968 unmapped: 516096 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60243968 unmapped: 516096 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60243968 unmapped: 516096 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60260352 unmapped: 499712 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60260352 unmapped: 499712 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60260352 unmapped: 499712 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60268544 unmapped: 491520 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60268544 unmapped: 491520 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60276736 unmapped: 483328 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60276736 unmapped: 483328 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60284928 unmapped: 475136 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60284928 unmapped: 475136 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60284928 unmapped: 475136 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60293120 unmapped: 466944 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60293120 unmapped: 466944 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60301312 unmapped: 458752 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60301312 unmapped: 458752 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60301312 unmapped: 458752 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60309504 unmapped: 450560 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60309504 unmapped: 450560 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60309504 unmapped: 450560 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60317696 unmapped: 442368 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60317696 unmapped: 442368 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.2 total, 600.0 interval#012Cumulative writes: 4343 writes, 19K keys, 4343 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 4343 writes, 398 syncs, 10.91 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4343 writes, 19K keys, 4343 commit groups, 1.0 writes per commit group, ingest: 16.04 MB, 0.03 MB/s#012Interval WAL: 4343 writes, 398 syncs, 10.91 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60391424 unmapped: 368640 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60399616 unmapped: 360448 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60399616 unmapped: 360448 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60416000 unmapped: 344064 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60416000 unmapped: 344064 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60424192 unmapped: 335872 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60424192 unmapped: 335872 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60424192 unmapped: 335872 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60432384 unmapped: 327680 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60440576 unmapped: 319488 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60440576 unmapped: 319488 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60448768 unmapped: 311296 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60448768 unmapped: 311296 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60456960 unmapped: 303104 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60456960 unmapped: 303104 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60473344 unmapped: 286720 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60473344 unmapped: 286720 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60489728 unmapped: 270336 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60489728 unmapped: 270336 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60514304 unmapped: 245760 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60514304 unmapped: 245760 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60530688 unmapped: 229376 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60530688 unmapped: 229376 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60547072 unmapped: 212992 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60547072 unmapped: 212992 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60563456 unmapped: 196608 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60563456 unmapped: 196608 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60571648 unmapped: 188416 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60571648 unmapped: 188416 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60596224 unmapped: 163840 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60596224 unmapped: 163840 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60604416 unmapped: 155648 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60604416 unmapped: 155648 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60612608 unmapped: 147456 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60612608 unmapped: 147456 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60612608 unmapped: 147456 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60628992 unmapped: 131072 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60628992 unmapped: 131072 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60645376 unmapped: 114688 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60645376 unmapped: 114688 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60653568 unmapped: 106496 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60653568 unmapped: 106496 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60653568 unmapped: 106496 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60710912 unmapped: 49152 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60710912 unmapped: 49152 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60710912 unmapped: 49152 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60768256 unmapped: 1040384 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60768256 unmapped: 1040384 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60768256 unmapped: 1040384 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60776448 unmapped: 1032192 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60776448 unmapped: 1032192 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60776448 unmapped: 1032192 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 ms_handle_reset con 0x555f1b271c00 session 0x555f19f23860
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 ms_handle_reset con 0x555f1b3fe800 session 0x555f1b28fc20
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0) v1
Dec  1 04:48:05 np0005540741 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/840775449' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.2 total, 600.0 interval
Cumulative writes: 4343 writes, 19K keys, 4343 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 4343 writes, 398 syncs, 10.91 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.2 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.2 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.2 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, in
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 handle_osd_map epochs [47,48], i have 47, src has [1,48]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 1081.428955078s of 1081.454589844s, submitted: 8
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 47 handle_osd_map epochs [48,48], i have 48, src has [1,48]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 48 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 48 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60850176 unmapped: 958464 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 48 handle_osd_map epochs [48,49], i have 48, src has [1,49]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 552245 data_alloc: 218103808 data_used: 24576
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61202432 unmapped: 17391616 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 49 heartbeat osd_stat(store_statfs(0x4fe0e2000/0x0/0x4ffc00000, data 0x9f226/0xeb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 49 handle_osd_map epochs [49,50], i have 49, src has [1,50]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 50 handle_osd_map epochs [50,50], i have 50, src has [1,50]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 50 ms_handle_reset con 0x555f1b858c00 session 0x555f1abae780
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61243392 unmapped: 17350656 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 50 handle_osd_map epochs [51,51], i have 50, src has [1,51]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 51 ms_handle_reset con 0x555f1b858000 session 0x555f1ab3f680
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 51 heartbeat osd_stat(store_statfs(0x4fc8d7000/0x0/0x4ffc00000, data 0x18a33eb/0x18f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 614157 data_alloc: 218103808 data_used: 24576
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 51 heartbeat osd_stat(store_statfs(0x4fc8d7000/0x0/0x4ffc00000, data 0x18a33eb/0x18f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 51 handle_osd_map epochs [52,52], i have 51, src has [1,52]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.322311401s of 10.521731377s, submitted: 31
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 615609 data_alloc: 218103808 data_used: 24576
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 615609 data_alloc: 218103808 data_used: 24576
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 615609 data_alloc: 218103808 data_used: 24576
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 615609 data_alloc: 218103808 data_used: 24576
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 615609 data_alloc: 218103808 data_used: 24576
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 25.412767410s of 25.425762177s, submitted: 13
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 52 handle_osd_map epochs [53,53], i have 52, src has [1,53]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 620555 data_alloc: 218103808 data_used: 28672
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 53 ms_handle_reset con 0x555f1b8eb400 session 0x555f1b28e5a0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61243392 unmapped: 17350656 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 53 heartbeat osd_stat(store_statfs(0x4fc8d2000/0x0/0x4ffc00000, data 0x18a5e55/0x18fb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 53 ms_handle_reset con 0x555f1b8eb000 session 0x555f1aa64960
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 53 ms_handle_reset con 0x555f1cb09000 session 0x555f1aa645a0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61259776 unmapped: 17334272 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 53 heartbeat osd_stat(store_statfs(0x4fc8d2000/0x0/0x4ffc00000, data 0x18a5e55/0x18fb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61259776 unmapped: 17334272 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 53 handle_osd_map epochs [54,54], i have 53, src has [1,54]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 54 ms_handle_reset con 0x555f1cb09000 session 0x555f1aa33a40
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62332928 unmapped: 16261120 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 54 ms_handle_reset con 0x555f1b858000 session 0x555f1aa32d20
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 54 ms_handle_reset con 0x555f1cb08800 session 0x555f1aa32000
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62332928 unmapped: 16261120 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 624199 data_alloc: 218103808 data_used: 28672
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 54 handle_osd_map epochs [55,55], i have 54, src has [1,55]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 55 ms_handle_reset con 0x555f1cb08c00 session 0x555f1aa32780
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61358080 unmapped: 17235968 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61390848 unmapped: 17203200 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 55 heartbeat osd_stat(store_statfs(0x4fc8cc000/0x0/0x4ffc00000, data 0x18a8a1d/0x1901000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 55 handle_osd_map epochs [55,56], i have 55, src has [1,56]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 56 ms_handle_reset con 0x555f1b858c00 session 0x555f1aa32f00
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62455808 unmapped: 16138240 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62455808 unmapped: 16138240 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 56 handle_osd_map epochs [57,57], i have 56, src has [1,57]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 57 ms_handle_reset con 0x555f1cb09000 session 0x555f1aa64780
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62504960 unmapped: 16089088 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 634040 data_alloc: 218103808 data_used: 28672
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62521344 unmapped: 16072704 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.331473351s of 11.440871239s, submitted: 32
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 57 handle_osd_map epochs [58,58], i have 57, src has [1,58]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 58 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab53a40
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 58 ms_handle_reset con 0x555f1cb08c00 session 0x555f1aa652c0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 15613952 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 58 handle_osd_map epochs [59,59], i have 58, src has [1,59]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 59 ms_handle_reset con 0x555f1b858c00 session 0x555f1ab3e3c0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 63291392 unmapped: 23699456 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 59 heartbeat osd_stat(store_statfs(0x4fa8b9000/0x0/0x4ffc00000, data 0x38adc5c/0x3914000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 59 handle_osd_map epochs [60,60], i have 59, src has [1,60]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 60 ms_handle_reset con 0x555f1b8eb000 session 0x555f1a5cde00
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 60 ms_handle_reset con 0x555f1b858000 session 0x555f1c7ac780
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 63479808 unmapped: 23511040 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 60 handle_osd_map epochs [61,61], i have 60, src has [1,61]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 61 ms_handle_reset con 0x555f1b858c00 session 0x555f1ab6e000
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 64602112 unmapped: 22388736 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 61 handle_osd_map epochs [61,62], i have 61, src has [1,62]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 62 ms_handle_reset con 0x555f1cb08c00 session 0x555f1aa610e0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1009042 data_alloc: 218103808 data_used: 45056
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 62 ms_handle_reset con 0x555f1cb08800 session 0x555f1aa5f860
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 65748992 unmapped: 21241856 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 62 handle_osd_map epochs [62,63], i have 62, src has [1,63]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 63 handle_osd_map epochs [63,63], i have 63, src has [1,63]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 63 ms_handle_reset con 0x555f1cb23000 session 0x555f1ab532c0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 63 ms_handle_reset con 0x555f1cb09000 session 0x555f1b29cd20
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 63 ms_handle_reset con 0x555f1b8eb400 session 0x555f1c7ac5a0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 66019328 unmapped: 20971520 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 63 handle_osd_map epochs [64,64], i have 63, src has [1,64]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 66109440 unmapped: 20881408 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 64 handle_osd_map epochs [65,65], i have 64, src has [1,65]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 65 ms_handle_reset con 0x555f1b858c00 session 0x555f1aa5f2c0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 65 ms_handle_reset con 0x555f1b858000 session 0x555f1ab53680
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 65 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab3ef00
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 66191360 unmapped: 20799488 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 65 heartbeat osd_stat(store_statfs(0x4fc89a000/0x0/0x4ffc00000, data 0x18b8e3b/0x1930000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 65 handle_osd_map epochs [66,66], i have 65, src has [1,66]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 66 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab3e780
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67280896 unmapped: 19709952 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 66 handle_osd_map epochs [67,67], i have 66, src has [1,67]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 705756 data_alloc: 218103808 data_used: 65536
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 67 ms_handle_reset con 0x555f1b858000 session 0x555f1a5cd0e0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67362816 unmapped: 19628032 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 67 handle_osd_map epochs [68,68], i have 67, src has [1,68]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.978338242s of 10.211093903s, submitted: 268
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67551232 unmapped: 19439616 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 68 ms_handle_reset con 0x555f1b858c00 session 0x555f1aa610e0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 68 handle_osd_map epochs [68,69], i have 68, src has [1,69]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 69 ms_handle_reset con 0x555f1b8eb400 session 0x555f1c783c20
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 69 ms_handle_reset con 0x555f1cb09000 session 0x555f1aa32b40
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67641344 unmapped: 19349504 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 69 handle_osd_map epochs [69,70], i have 69, src has [1,70]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 70 ms_handle_reset con 0x555f1cb09000 session 0x555f1ab6fc20
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 70 ms_handle_reset con 0x555f1b858000 session 0x555f1c783a40
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67756032 unmapped: 19234816 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 70 handle_osd_map epochs [71,71], i have 70, src has [1,71]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 71 ms_handle_reset con 0x555f1b858c00 session 0x555f1aa64f00
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67805184 unmapped: 19185664 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 71 heartbeat osd_stat(store_statfs(0x4fb2e2000/0x0/0x4ffc00000, data 0x18bf0bd/0x1936000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 712002 data_alloc: 218103808 data_used: 73728
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67846144 unmapped: 19144704 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 71 ms_handle_reset con 0x555f1b8eb400 session 0x555f1ab3f860
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67846144 unmapped: 19144704 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 71 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab3e780
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 19161088 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 71 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab3f4a0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 19161088 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 71 handle_osd_map epochs [73,73], i have 71, src has [1,73]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 71 handle_osd_map epochs [72,73], i have 71, src has [1,73]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 73 ms_handle_reset con 0x555f1b858000 session 0x555f1ab3fc20
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 73 heartbeat osd_stat(store_statfs(0x4fb2bd000/0x0/0x4ffc00000, data 0x18e5d82/0x1960000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69132288 unmapped: 17858560 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 722395 data_alloc: 218103808 data_used: 81920
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69156864 unmapped: 17833984 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 73 heartbeat osd_stat(store_statfs(0x4fb2bd000/0x0/0x4ffc00000, data 0x18e5d82/0x1960000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69156864 unmapped: 17833984 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69156864 unmapped: 17833984 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 73 handle_osd_map epochs [74,74], i have 73, src has [1,74]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.536053658s of 12.151283264s, submitted: 168
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 74 ms_handle_reset con 0x555f1cb09000 session 0x555f1aa614a0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69206016 unmapped: 17784832 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 74 handle_osd_map epochs [75,75], i have 74, src has [1,75]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 75 ms_handle_reset con 0x555f1cb08c00 session 0x555f1aa603c0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69255168 unmapped: 17735680 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 731868 data_alloc: 218103808 data_used: 102400
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 75 ms_handle_reset con 0x555f1cb22c00 session 0x555f1aa330e0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69443584 unmapped: 17547264 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 75 ms_handle_reset con 0x555f1b858000 session 0x555f1ab53e00
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 75 ms_handle_reset con 0x555f1cb08800 session 0x555f1b28e960
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69443584 unmapped: 17547264 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 75 heartbeat osd_stat(store_statfs(0x4fb2b7000/0x0/0x4ffc00000, data 0x18e8982/0x1966000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 75 handle_osd_map epochs [76,76], i have 75, src has [1,76]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 76 handle_osd_map epochs [76,77], i have 76, src has [1,77]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 77 handle_osd_map epochs [77,77], i have 77, src has [1,77]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69484544 unmapped: 17506304 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 77 ms_handle_reset con 0x555f1cb08c00 session 0x555f1c782960
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 77 handle_osd_map epochs [78,78], i have 77, src has [1,78]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 78 ms_handle_reset con 0x555f1cb09000 session 0x555f1ab661e0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69492736 unmapped: 17498112 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 78 handle_osd_map epochs [78,79], i have 78, src has [1,79]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 79 heartbeat osd_stat(store_statfs(0x4fb2af000/0x0/0x4ffc00000, data 0x18eb9f2/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69541888 unmapped: 17448960 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 748246 data_alloc: 218103808 data_used: 102400
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69541888 unmapped: 17448960 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69541888 unmapped: 17448960 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb22800 session 0x555f1ab67860
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1b858000 session 0x555f1ab66d20
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab66000
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb08c00 session 0x555f1ab66b40
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb22400 session 0x555f1aa32b40
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb09000 session 0x555f1ab67c20
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1b858000 session 0x555f1c782000
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab66b40
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb09c00 session 0x555f1ab67860
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69591040 unmapped: 17399808 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb09800 session 0x555f1aa61c20
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 79 heartbeat osd_stat(store_statfs(0x4fb2aa000/0x0/0x4ffc00000, data 0x18ee51a/0x1974000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69591040 unmapped: 17399808 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 79 handle_osd_map epochs [80,80], i have 79, src has [1,80]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.622175217s of 10.975051880s, submitted: 122
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 80 ms_handle_reset con 0x555f1cb09400 session 0x555f1ab6e3c0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69910528 unmapped: 17080320 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 757501 data_alloc: 218103808 data_used: 114688
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 17072128 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 80 heartbeat osd_stat(store_statfs(0x4fb281000/0x0/0x4ffc00000, data 0x1913a15/0x199c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 17072128 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 80 heartbeat osd_stat(store_statfs(0x4fb281000/0x0/0x4ffc00000, data 0x1913a15/0x199c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 17072128 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 17072128 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 80 ms_handle_reset con 0x555f1cb09800 session 0x555f1abae3c0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69836800 unmapped: 17154048 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 80 handle_osd_map epochs [81,81], i have 80, src has [1,81]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 81 ms_handle_reset con 0x555f1cb09c00 session 0x555f1ab3e960
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 81 ms_handle_reset con 0x555f1cb08000 session 0x555f1aa652c0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 81 ms_handle_reset con 0x555f1cb08c00 session 0x555f1b4b92c0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 81 ms_handle_reset con 0x555f1cb23400 session 0x555f1aa60780
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 766180 data_alloc: 218103808 data_used: 135168
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69959680 unmapped: 17031168 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 81 handle_osd_map epochs [82,82], i have 81, src has [1,82]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 82 ms_handle_reset con 0x555f1cb08000 session 0x555f1ab6fc20
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 70017024 unmapped: 16973824 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 82 handle_osd_map epochs [83,83], i have 82, src has [1,83]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 83 ms_handle_reset con 0x555f1cb08c00 session 0x555f1ab66d20
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71081984 unmapped: 15908864 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 83 heartbeat osd_stat(store_statfs(0x4fb276000/0x0/0x4ffc00000, data 0x1917fc3/0x19a7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 83 ms_handle_reset con 0x555f1cb09800 session 0x555f1c782000
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 83 ms_handle_reset con 0x555f1cb09c00 session 0x555f1aa60780
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 83 ms_handle_reset con 0x555f1cb23c00 session 0x555f1aa652c0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 15892480 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 15892480 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 83 ms_handle_reset con 0x555f1cb08000 session 0x555f1ab67860
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.769915581s of 10.991518974s, submitted: 90
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 771276 data_alloc: 218103808 data_used: 139264
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 15884288 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 83 handle_osd_map epochs [84,84], i have 83, src has [1,84]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 84 ms_handle_reset con 0x555f1cb08c00 session 0x555f1b91a780
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 15859712 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 84 ms_handle_reset con 0x555f1b858000 session 0x555f1b28e960
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 84 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab6f860
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 84 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x19191dd/0x19a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 15826944 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72237056 unmapped: 14753792 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 84 handle_osd_map epochs [85,85], i have 84, src has [1,85]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72278016 unmapped: 14712832 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 85 handle_osd_map epochs [86,86], i have 85, src has [1,86]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 86 ms_handle_reset con 0x555f1cb09800 session 0x555f1b29cd20
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 776985 data_alloc: 218103808 data_used: 139264
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72302592 unmapped: 14688256 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72318976 unmapped: 14671872 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72318976 unmapped: 14671872 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 86 ms_handle_reset con 0x555f1b858c00 session 0x555f1ab53a40
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 86 ms_handle_reset con 0x555f1b8eb400 session 0x555f1abaf860
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 86 heartbeat osd_stat(store_statfs(0x4fb293000/0x0/0x4ffc00000, data 0x18f7c42/0x1988000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 86 ms_handle_reset con 0x555f1b858000 session 0x555f1b29d680
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72351744 unmapped: 14639104 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72351744 unmapped: 14639104 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 86 handle_osd_map epochs [87,87], i have 86, src has [1,87]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.348530769s of 10.076562881s, submitted: 127
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 87 ms_handle_reset con 0x555f1cb08000 session 0x555f1b4b9860
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 775824 data_alloc: 218103808 data_used: 143360
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72368128 unmapped: 14622720 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 87 handle_osd_map epochs [88,88], i have 87, src has [1,88]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 88 ms_handle_reset con 0x555f1cb08800 session 0x555f1c782960
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 88 heartbeat osd_stat(store_statfs(0x4fb2b5000/0x0/0x4ffc00000, data 0x18d66d9/0x1967000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 777730 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 88 heartbeat osd_stat(store_statfs(0x4fb2b5000/0x0/0x4ffc00000, data 0x18d66d9/0x1967000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 88 handle_osd_map epochs [89,89], i have 88, src has [1,89]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 88 handle_osd_map epochs [89,89], i have 89, src has [1,89]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 89 heartbeat osd_stat(store_statfs(0x4fb2b3000/0x0/0x4ffc00000, data 0x18d7b95/0x196a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 779854 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 89 handle_osd_map epochs [90,90], i have 89, src has [1,90]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.189030647s of 11.284521103s, submitted: 85
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b3000/0x0/0x4ffc00000, data 0x18d7b95/0x196a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72613888 unmapped: 14376960 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: do_command 'config diff' '{prefix=config diff}'
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: do_command 'config show' '{prefix=config show}'
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: do_command 'counter dump' '{prefix=counter dump}'
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: do_command 'counter schema' '{prefix=counter schema}'
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 14114816 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 14147584 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 13893632 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: do_command 'log dump' '{prefix=log dump}'
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: do_command 'perf dump' '{prefix=perf dump}'
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73039872 unmapped: 24993792 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: do_command 'perf schema' '{prefix=perf schema}'
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1111]
                                              ** DB Stats **
                                              Uptime(secs): 1800.2 total, 600.0 interval
                                              Cumulative writes: 6021 writes, 24K keys, 6021 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                              Cumulative WAL: 6021 writes, 1127 syncs, 5.34 writes per sync, written: 0.02 GB, 0.01 MB/s
                                              Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                              Interval writes: 1678 writes, 4735 keys, 1678 commit groups, 1.0 writes per commit group, ingest: 2.54 MB, 0.00 MB/s
                                              Interval WAL: 1678 writes, 729 syncs, 2.30 writes per sync, written: 0.00 GB, 0.00 MB/s
                                              Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: mgrc ms_handle_reset ms_handle_reset con 0x555f1a4edc00
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3312476512
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3312476512,v1:192.168.122.100:6801/3312476512]
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: mgrc handle_mgr_configure stats_period=5
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 ms_handle_reset con 0x555f1b271000 session 0x555f19f23680
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 ms_handle_reset con 0x555f1b3fe400 session 0x555f1c7ac1e0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 ms_handle_reset con 0x555f1b271c00 session 0x555f1aabb0e0
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73105408 unmapped: 24928256 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73105408 unmapped: 24928256 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73105408 unmapped: 24928256 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 ms_handle_reset con 0x555f1b858400 session 0x555f1b29dc20
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec  1 04:48:05 np0005540741 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
